I have a growing collection of e-books thanks to my mom getting me my first e-reading device roughly seven years ago; I've owned an e-reader since Christmas of 2009. My first, the Nook 1st Edition, was easy to manage: I kept everything on one device and that was it. Today, I read e-books across devices, moving from a Kindle Paperwhite to my phone (Nexus 6P), to my NVIDIA Shield, to my Surface Pro, and then maybe reading on the desktop every once in a while.
The problem with reading across devices is two-fold: 1) how do I manage my e-books in a central location but access them everywhere? and 2) how do I sync my reading position across devices?
The solution to the first problem was Calibre. With Calibre you can manage your library from multiple sources and add metadata to your books. Calibre can also upload books to your devices locally, or it can host a content server.
The content server, when paired with dynamic DNS, a static IP for the computer hosting the server, and some port forwarding on your home router, gives you access to your books on any device, anywhere.
My ASUS router lets me use its asuscomm dynamic DNS. I also set up a static IP for the HTPC that hosts the content server, picked a port (not 80, since that is HTTP), and forwarded it on the router. Calibre also lets you set a username and password to protect your files from others, though not every app you'll use supports authentication.
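If you prefer the command line, the content server can also be started directly. A minimal sketch, assuming a recent version of calibre (the port number and library path below are illustrative; older releases used --username/--password flags instead of --enable-auth):

```shell
# Serve the library on a non-standard port with authentication enabled.
# Pair this with dynamic DNS and a port-forward rule on the router to
# reach the server from anywhere.
calibre-server --port 8081 --enable-auth /path/to/calibre-library
```

Once running, the library is browsable at http://your-dynamic-dns-name:8081 from any device that can reach your router.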
The second problem is really one of convenience: picking up where you left off any time you switch devices. For that I upload books to Google Play Books, which lets you upload an e-book and then read it across devices (provided you can use the Google Play Books app or website on that device). The major benefit is that I usually have access to at least one device with Google Play Books on it.
Once you've got your books in Calibre you can manage them all from the Calibre app while accessing them anywhere. Meanwhile, you can upload books to (and optionally delete them from) Google Play Books, which will sync your reading progress across devices.
One of my new year’s resolutions (besides updating my blog weekly) has been to learn more ‘stuff.’ My wife would probably laugh at that statement since 90% of our conversations revolve around some new, really cool, thing I want to tell her about.
In the interest of sharing, I figured I’d make a list (looking at you, Buzzfeed) of the podcasts I’ve been listening to for the past month and my thoughts. Podcasts are presented from least likely to skip an episode to most likely.
Planet Money – an excellent ‘economics’ podcast with interesting topics and great voices/sound quality. I always find myself engrossed in Planet Money.
The Pollsters – a recent addition; I still get urges to look at polling thanks to my time as a Political Science/Pre-Law major at YSU. This one is really PC, and they talk about methodology, which is fantastic.
Tested (Adam Savage) – excellent conversation about a wide variety of things, great background listening while toiling at my desk at home.
Freakonomics – these used to be better, and to be honest I sometimes consider skipping them now: the insight of Freakonomics brought to other interesting topics. It's hit or miss whether an episode is interesting or whether the insight is “worth it.”
Serial – the internet sensation makes the list. Season One was excellent, Season Two is good so far, and with a relaxed release schedule it'll be even harder to justify a skip.
Hidden Brain – another podcast about behavioral and psychological trends I never knew about. The key here is that I find the host's diction and voice non-grating, unlike the next several podcasts.
Stuff You Should Know – an interesting podcast about things I generally want to know more about; if I don't, I skip it. The hosts' voices sometimes wear on me though.
Where There's Smoke – a self-development podcast. I usually skip around their archives until I see something I want to know more about. Mostly hit or miss, but still worth a look.
Completely (Optional) Knowledge – run by Greenpeace, surprisingly, it's “the show that answers the questions you never knew you had.” The last episode I listened to was about the longest someone has lived underwater, moving by way of an anecdote about underwater tea parties into a story about living in a ‘long tube’ under the ocean. Interesting and well produced.
What's the Point – 538's ‘latest’ podcast; it's like listening in on a discussion about how data is influencing just about everything. The host is great (I believe he's worked on a few other economics-related podcasts), but I skip around depending on the topic.
This American Life– I, like many people, went through a love affair with TAL. However, after sticking with it for several years I just find myself less infatuated with the show now. I skip anything that doesn’t strike me as immediately interesting.
Song Exploder – I have not spent a lot of time with this one yet. The concept is interesting: have an artist discuss the creation of one of their songs, the meaning, etc. This week I saw that they've got MGMT with Time to Pretend, so that should be worth a look.
Half Hour Intern – I want to like this podcast, but the episodes are hit and miss. The latest episode, with the woman who does nipple tattoos for breast cancer survivors, was really interesting though.
Useful Science – in a nutshell, they discuss scientific papers and their quality/implications. Often I can take it or leave it depending on the field of knowledge being discussed.
Podcasts I might revisit soon:
Intelligence Squared Debates – like many of the above, you sometimes have to skip around, but these are often very interesting. One very memorable debate had the topic “The world would be better off without religion.” It was held before the current wave of ISIS and domestic terrorism, and I think if that debate were held again today it might sound very different.
Wait Wait Don’t Tell Me – I started listening to this at the behest of my sister. I like it, but often come away feeling like I could’ve been doing something else.
Today I added reCAPTCHA to the site via the BestWebSoft Google Captcha plugin. We'll see if it makes any dent in the numbers. Regardless, any drop is better than having to navigate 5,000 comments to see if anything wasn't spam.
If any interesting numbers come out of it (read: statistics) I’ll share them here in a few weeks.
For Christmas this year I ended up building a bunch of RetroPies from Raspberry Pi 2 “kits” (they come with a case, an RPi2, and a power supply).
I also ended up getting controllers; for my dad I got a Buffalo SNES USB controller. For everyone else, especially if you're planning to play anything more modern, I'd want something wireless and with analog sticks, like the Logitech F710.
On all three of the RPis I built for others I used SanDisk Ultra 32GB microSD cards. These were the cheapest among the brands I like, and SanDisk cards have never gone corrupt on me.
With one of the Pis, a friend of mine wanted to use an Xbox 360 controller. That ended up being a boondoggle because, after installing the X360 driver from the menu, it tries to use two controller slots. I've yet to resolve that issue, save for plugging it in last so it takes up the two slots after all other controllers are plugged in.
All that being said, everyone that got one absolutely loved it.
Resolving some problems
I also ran into two problems, first with inputs not autoconfiguring in RetroArch. Worse than that, the configure-controller option for RetroArch was missing from the RetroPie setup menu. I resolved the input issue by reflashing the image from PetRockBlock (3.2.1). For some reason, updating RetroPie from the binary was breaking autoconfig.
The second problem was Player 2 input not working on the image I got from PetRockBlock (RetroPie 3.2.1). I was able to resolve it by SSH'ing into the RPi and running the following commands (found here):
git clone https://github.com/libretro/stella-libretro.git
(this downloads the code to your pi; do it in your home directory)
cd stella-libretro
(this navigates into the source code you just downloaded)
make
(this compiles the code)
cd /opt/retropie/libretrocores/lr-stella
(this navigates to the directory on your pi that houses the stella emulation library; the exact path may differ on your image, so check under /opt/retropie/libretrocores/)
sudo cp stella_libretro.so stella_libretro.so.bak
(this makes a backup of your old stella library just in case)
sudo rm stella_libretro.so
(this deletes your old stella library to make room for the new one; the backup is still there)
sudo cp /home/pi/stella-libretro/stella_libretro.so stella_libretro.so
(this copies your newly-built stella library from the source code directory to the directory where your stella emulation library lives)
If something goes wrong, you can revert to your old stella library backup by doing:
cd /opt/retropie/libretrocores/lr-stella
(navigates you to the directory where your stella emulation library lives; adjust the path if your image differs)
sudo rm stella_libretro.so
(this deletes your newly-built stella library to make room for your old backup to be restored)
sudo mv stella_libretro.so.bak stella_libretro.so
(this renames your backup to make it work with RetroArch again)
A few tips I learned:
Creating a ghost of the installation, with everything I wanted on the SD card, was great for creating more RetroPies. I did all my setup on the first one and then cloned the card afterwards using Win32DiskImager.
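Win32DiskImager handles the cloning on Windows; on Linux the same clone can be sketched with dd (the device name /dev/sdX below is a placeholder, so double-check it with lsblk before writing anything):

```shell
# Read the fully configured master card into an image file...
sudo dd if=/dev/sdX of=retropie-master.img bs=4M status=progress

# ...then write that image to each fresh card of equal or larger size.
sudo dd if=retropie-master.img of=/dev/sdX bs=4M status=progress
sync   # flush pending writes before pulling the card
```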
Edit the input configuration so that hitting both analog sticks (or another combination if your controller lacks those buttons) opens the RetroArch menu. The config file is located at \\RETROPIE\configs\all\retroarch.cfg; you can uncomment the section regarding a menu combo and use the corresponding number for each combo (search the file for “combo”).
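For reference, the setting in question looks roughly like this (the option name matches RetroArch configs of that era, but the number-to-combo mapping can vary between versions, so verify against the comments in your own file):

```
# Toggle the RetroArch menu with a controller combo.
# In my file, 2 mapped to clicking both analog sticks (L3 + R3).
input_menu_toggle_gamepad_combo = "2"
```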
Setting up SAMBA shares worked great for giving non-tech-savvy people the ability to add their own content.
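The RetroPie setup script can install the Samba shares for you, but for reference a hand-rolled share for the ROMs folder would look something like this in /etc/samba/smb.conf (the share name and options here are illustrative, not copied from the RetroPie image):

```
[roms]
   comment = RetroPie ROMs
   path = /home/pi/RetroPie/roms
   writeable = yes
   guest ok = yes
   create mask = 0644
   force user = pi
```

After editing, restart Samba (sudo service smbd restart); if the Pi's hostname is retropie, the share then shows up on the network as \\RETROPIE\roms.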
Installing the experimental smartphone-controller package was a nice addition for people who don't want extra controllers lying around all the time but occasionally want to play with more people.
Most of the USB Wi-Fi adapters I tried had awful speeds, so I ended up relying on Ethernet 90% of the time during my setups.
Update 8/31/2015: I take it all back. It was only a temporary fix. My new diagnosis is that something has corrupted my account, or the Mail app is mishandling things. I'll update if I find something that works.
I've been pretty aggravated lately with a recurring error in Windows 10's built-in Mail client with my work e-mail account.
I'll admit it, I'm a bit of a geek when it comes to networking. I took some Cisco classes in high school and ended up really enjoying them. Because of that, I often don't like or don't want the equipment my ISP tries to provide me with.
That was the case with Fioptics' Westell UltraLine Series 3 (WUS3). It uses a VDSL connection to connect multiple units in our condo building to the internet. I really wanted to start using my ASUS N600 router for my primary LAN/Wi-Fi, but the WUS3 has a built-in router too. Rather than messing around with bridging the WUS3 into a modem-only device, I ended up using the DMZ host function to expose my ASUS to the internet.
If you're reading this, you've likely had the same problem (the bridge interface is a pain and doesn't really work). In that case, follow my instructions and hopefully you won't waste your Saturday evening trying to get your network set up the way you want it.
1. Enable the DMZ host on the WUS3 via its homepage by clicking Firewall > DMZ Host and setting the DMZ to the address you'll give the router (I used 192.168.200.2, the next logical increment after the WUS3's IP).
2. Connect the router's WAN port to any of the four Ethernet ports on the WUS3's built-in switch.
3. Connect to the router and use the configuration or internet wizard to tell the router to use the static IP we set as the DMZ host.
4. Check the connections and you should be on your way.
After consulting this thread and failing to get the bridge working, I reset the Westell one more time. This time I did the following and it worked!
1. click on my network
2. click on connections
3. click on WAN
4. go to bridging
5. bridge the WAN and ethernet (I’m a bit foggy here, if you try the same let me know what you did here so I can have a more complete description)
6. you’ll be prompted about changes to br0 and bridging WAN
7. That should be it; my router, plugged into an Ethernet port on the Westell's built-in switch, got a DHCP address from the ISP.
Bonus: I left the Wi-Fi active (and out of the bridge) so I can still access the Westell even though it is currently in bridge mode.
One question I keep coming back to in my personal and professional experiences is how has small town X not started using Y or Z technology?
Obviously this is an opinion piece, but let me write it down just in case: these are my opinions and do not represent those of my current or previous employers.
This question comes from a laundry list of experiences I've had working in, driving through, or living in rural areas and small towns. Most frustrating was my experience with a small city that would only accept payment for traffic tickets with certified checks, ignoring the progress in payment systems that so many other places have made. Many cities now accept personal checks, credit cards, or even PayPal, and we live in an era where E-GovLink allows municipalities to accept Bitcoin. I think this highlights the gap in technology adoption and the digital divide between rural and urban places.
A Digital Divide
Access to computers and technology is still an issue in many places. Kentucky continues to operate its Broadband KY initiative in the hope of wiring more homes to the internet. Google Fiber is making progress in offering free internet access to people, non-profits, and community organizations. I would have guessed that by 2014 most places would at least have a web presence; instead, I often search for cities and counties on Google and find that they have zero presence.
One example of the successful adoption of social media and a web presence is the Brimfield, Ohio Police Department. Another: one of the most used parts of the City of St. Bernard's website was the Tax Department webpage (disclosure: I worked for the city from 2009-2011). The Tax Department offers digital copies of its tax documents and information for residents, ultimately lessening the calls the city gets about taxes. The City of Cincinnati has made my life a bit easier by offering information through its Public Services Twitter account, connecting with residents to let them know about snow removal efforts, remind them about winter safety, and post about snow emergencies. Cincinnati has a whole slew of Twitter accounts and Facebook pages, all helping residents connect with and get information from the city.
Social media and web presence are often discussed as mandatory in comprehensive planning efforts. One particularly cool recent example is MyNKY, Vision 2015's new public participation campaign. Using an interactive game (and what seems a lot like dot voting), participants contribute a quantitative opinion about spending priorities and follow up with qualitative descriptions in a targeted survey. The quant-qual pairing makes it easy for MyNKY to gather preferences and then drill down into what those preferences mean to each participant. Not only is it efficient, it minimizes the time commitment to participate.
A bit more complicated than an easy button:
With so many examples of best practices, why don't more cities use the web and social media for public engagement and information distribution? In my experience it comes down to a lack of available talent and to budget priorities. My evidence about the talent needs of cities is anecdotal at best, but most places I've worked with that lack these elements also have an institutional attitude of little or no interest in websites, social media, or technology; essentially, there needs to be organizational will to make it happen.
Even with the will to make it happen, figuring out how to finance it is not easy. Budgets continue to grow tighter, and there is no easy way to make room for an IT budget or department (especially in the smallest cities). Some cities are already facing impending cuts to services or employees and cannot possibly cram in an IT budget.
One of the benefits of my current job with the NKADD is that I get to share my experience with technology, web development, and social media with the jurisdictions in our service area.
That doesn't mean sharing my experience will always result in successful websites and social media presences. Social media can require near-constant monitoring, and some citizens come to expect an almost instantaneous response. Setting clear expectations about the kinds of communication possible, and when that communication can happen, is difficult. Websites also need maintenance and take time to develop appropriate content.
Even in spite of the financial and labor requirements of a digital presence, I think it is still worthwhile for cities to pursue one. The potential benefits to efficiency, public interaction, and engaging younger generations are too important to ignore altogether.