Categories: Photos

New Jersey Festival of Ballooning

Categories: Space

Pluto

Photo: NASA / JHUAPL / SwRI

Planet or not, roughly 100 years after the first powered flight, we’ve sent a vehicle equipped with a camera to Pluto to take pictures and wirelessly transmit them back. That’s nothing short of amazing.

Categories: Photos

Fireworks From Liberty State Park

Took some photos from down by Liberty State Park for the 4th of July.

Categories: Open Source

Adventures In KVM Land

Hopefully this saves someone doing a Google search some time. Running something like sudo ubuntu-vm-builder kvm trusty on Ubuntu 14.04 (Trusty Tahr) seems to sometimes generate the following error (note the non-PAE line):

Preparing to unpack .../linux-image-virtual_3.13.0.52.59_amd64.deb ...
Unpacking linux-image-virtual (3.13.0.52.59) ...
, stderr: grep: /proc/cpuinfo: No such file or directory
This kernel does not support a non-PAE CPU.
dpkg: error processing archive /var/cache/apt/archives/linux-image-3.13.0-52-generic_3.13.0-52.86_amd64.deb (--unpack):
 subprocess new pre-installation script returned error exit status 1
Examining /etc/kernel/postrm.d .
run-parts: executing /etc/kernel/postrm.d/initramfs-tools 3.13.0-52-generic /boot/vmlinuz-3.13.0-52-generic
run-parts: executing /etc/kernel/postrm.d/zz-update-grub 3.13.0-52-generic /boot/vmlinuz-3.13.0-52-generic
Errors were encountered while processing:
 /var/cache/apt/archives/linux-image-3.13.0-52-generic_3.13.0-52.86_amd64.deb
E: Sub-process /usr/bin/dpkg returned an error code (1)

Turns out this is an older known bug. Adding --addpkg linux-image-generic seems to work as recently as Trusty Tahr.
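
In practice the workaround is just the original invocation with the extra package appended, roughly like this (other options omitted; adjust to your own setup):

# Pull in linux-image-generic explicitly to dodge the bogus non-PAE check
sudo ubuntu-vm-builder kvm trusty --addpkg linux-image-generic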

Categories: Audio/Video, Funny

The One With The Guy Obsessed With Friends

Video: Friends Obsessed Guy

This guy might be more than a little obsessed. I must admit his attention to detail is amazing though.

Categories: General, Mozilla, Software

On Deprecating HTTP

Mozilla announced:

There’s pretty broad agreement that HTTPS is the way forward for the web. In recent months, there have been statements from IETF, IAB (even the other IAB), W3C, and the US Government calling for universal use of encryption by Internet applications, which in the case of the web means HTTPS.

I’m on board with this development 100%. I say this as a web developer who has faced, and will face, some uphill battles to bring everything into HTTPS land. It won’t happen immediately, but the long-term plan is 100% HTTPS. It’s not the easiest move for the internet, but it’s undoubtedly the right move for the internet.

A brief history

The lack of encryption on the internet is not too different from the weaknesses in email and SMTP that make spam so prolific. Once upon a time the internet was mainly a tool of academics; trust was implicit and ethics were paramount. Nobody thought security was of major importance. Everything was done in plain text for performance and easy debugging. That’s why you can use telnet to debug most of the older popular protocols.

In 2015 the landscape has changed. Academic use of the internet is a small fraction of its traffic, and malicious traffic is a growing concern. Free sharing of information, the norm in the academic world, is the exception in some of the places the internet reaches.

Protecting the user

Users deserve to be protected as much as technology will allow. Some folks claim “non-sensitive” data exists. I disagree, because sensitivity is subjective and a matter of personal perspective. What’s sensitive to someone in a certain situation is not sensitive to others. Topics that are normal and safe to discuss in most of the world are not safe in others. Certain search queries are more sensitive than others (medical questions, sensitive business research). A web developer doesn’t have a good grasp of what is or isn’t sensitive; it’s specific to the individual user. It’s not every network admin’s right to know if someone on their network browsed for and/or purchased pregnancy tests, or bought a book on parenting children with disabilities on Amazon. The former may not go over well at a “free” conservative school in the United States, for example. More than just credit card information is “sensitive data” in these cases. Nobody should be so arrogant as to think they understand how every person on earth might come across their website.

Google and Yahoo took the first step by moving search to HTTPS (Bing, oddly enough, still seems to be using HTTP). This is the obvious second step toward protecting the world’s internet users.

Protecting the website’s integrity

Michelangelo David - Censored

Unfortunately, you can no longer be certain a user sees a website the way you, the web developer, intended it. Sorry, but it doesn’t work that way. For years ISPs have been testing the ability to do things like insert ads into webpages. As far as I’m aware, in the U.S. there’s nothing explicitly prohibiting replacing ads. Even net neutrality rules seem limited to degrading or discriminating against certain traffic, not modifying payloads.

I’m convinced the next iteration of the Great Firewall will not explicitly block content, but censor it. That will be harder to detect than simply being denied access to a website. The ability to do large-scale processing like this is becoming more practical: just remove the offending block of text or image. Citizens of oppressive regimes may not notice a thing.

There have also been attempts to “optimize” images and video. Again, even net neutrality isn’t entirely clear here, assuming the practice isn’t targeted at competitors, for example.

But TLS isn’t perfect!

True, but let’s be honest: it’s 8,675,309 times better than using nothing. CAs are a vulnerability, a bottleneck, and a potential target for governments looking to control information. But browsers and OSes allow you to manage certificates, and the ability to stop trusting a CA exists. Technology will improve over time; I don’t expect us to still be using TLS 1.1 and 1.2 in 2025, and hopefully substantial improvements get made along the way. This argument is akin to not buying a computer because there will be a faster one next year. TLS is the best option today, and we can replace it with better methods as they become available.

SSL Certificates are expensive!

First of all, domain validation certificates can be found for as little as $10. Secondly, I fully expect these prices to drop as demand increases. Domain validation certificates have virtually no marginal cost since issuance is fully automated, so the cheaper options will see substantial growth as demand grows. There’s no limit on “supply” beyond the computing power needed to issue them; a pricing war is inevitable. It would happen even faster if someone like Google bought a large CA and dropped prices to rock bottom. Certificates will get far cheaper before HTTPS is essential. $10 is the early adopter fee.

But XYZ doesn’t support HTTPS!

True, not everyone supports it yet. That will change. It’s also true that some services (CDNs, for example) are still charging insane prices for HTTPS, so it’s not practical for everyone to switch today, or even this year. But that too will change as demand increases. Encryption overhead is nominal. Once again, pricing wars will happen once customers want more than just their shopping carts served over SSL. The problem today is that demand is minimal, but those who need it must have it, so price gouging is the norm.

Seriously, we need to do this?

Yes, seriously. HTTPS is the right direction for the internet. There are valid arguments for not switching your site over today, but those roadblocks will disappear, and you should periodically re-evaluate where you stand. I’ve moved a few sites, including this blog (SPDY for now, HTTP/2 soon), to experience what would happen. It was largely a smooth transition. I’ve got some sites still on HTTP. Some will stay on HTTP for the foreseeable future due to other circumstances; others will switch sooner. This doesn’t mean HTTP is dead tomorrow, or next year. It just means the future of the internet is HTTPS, and you should be part of it.
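
As a rough illustration of what that kind of move involves on the server side, here is a minimal nginx sketch: redirect plain HTTP to HTTPS and serve the site over TLS with SPDY enabled. The server name and certificate paths are placeholders, not this blog’s actual configuration.

# Redirect all plain-HTTP requests to their HTTPS equivalent.
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}

# Serve the site over TLS. The "spdy" flag applies to nginx builds from
# the SPDY era (pre-1.9.5); newer versions use "http2" here instead.
server {
    listen 443 ssl spdy;
    server_name example.com;

    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    root /var/www/example.com;
}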

Categories: Photos

Urban Landscape

Took some photos on the streets of NYC, and mostly of the Pep Boys in Jersey City that’s about to be demolished and is currently serving as a canvas.

Categories: Open Source, Software

MySQL To Percona Server Gotcha

Decided to replace the aging MySQL 5.1.x on a CentOS box with a newer Percona Server 5.6. The first step was to update MySQL 5.1 to 5.5. That went relatively smoothly after I figured out some MySQL transaction kung-fu and ran mysql_upgrade. Step two was to replace it with Percona Server. It installed fine. Almost too simple. So naturally I ran:

/etc/init.d/mysql start

which resulted in a dreaded:

Starting MySQL (Percona Server).... ERROR! The server quit without updating PID file (/var/lib/mysql/SERVERNAME.pid)

After a few minutes of poring through the logs I noticed this little nugget:

2015-04-25 19:18:16 18234 [ERROR] /usr/sbin/mysqld: unknown variable 'table_cache=7K'

Apparently MySQL 5.1.3 renamed table_cache to table_open_cache, and by 5.6 the old name is rejected outright. A simple rename in my.cnf, and we’re on our way. Now running a little faster thanks to some much newer DB binaries.
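
For anyone hitting the same error, the fix really is just the variable name in the [mysqld] section, something like this (carrying over the 7K value from the config shown in the error above):

[mysqld]
# table_cache was renamed to table_open_cache in MySQL 5.1.3;
# Percona Server 5.6 rejects the old name as an unknown variable.
table_open_cache = 7K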

Categories: Tech (General)

AeroFarms Using Technology To Grow Food

There’s been a fair amount of research in the past few years into more scalable and environmentally friendly ways to grow food. AeroFarms is embarking on an ambitious plan to put some of that technology to work at a Newark, NJ farm housed in an old factory.

The basic idea is to use technology like energy-efficient LED lighting and carefully monitored watering and nutrition delivered as a mist (versus planting in soil). Space is optimized by stacking the growing trays, similar to servers in a data center. The end result is that you can grow edible plants in a fast, efficient, predictable way: no more weather ruining crops, and no more thousands of acres of land devoted to farming.

Just imagine if one day this could be deployed to places like Africa. Efficient solar-powered farming could change how the continent grows food, and it could become much more practical to farm in desert regions.

Categories: Tech (General)

F-35 Augmented Reality Helmet

F-35 Helmet

The F-35 helmet is one of the most impressive things you’ll see, technology-wise, this year. It will take a long while, but eventually augmented reality will get better, more compact, and cheaper until it hits the civilian market. No more “blind spots” in a car. No more being unable to see the obstruction in front of the large truck ahead of you. I think there’s a good chance we’ll see this hit the market before we see self-driving cars prolific enough to remove the “driver”.