As the guy who wrote one of the first open-source status update systems in 2007, the main developer behind Acquia Commons social business software 2.x, and an evangelist of social communications technologies, writing the title of this post feels strange. I've spent the last 5 years of my life building software to make it easy for people to build social networks, so why would I suggest that sometimes you shouldn't do it?
This is a slightly edited response I recently wrote to someone who asked how to learn skills that would be useful at a hackathon. It's my usual response when someone asks how to get started programming.
You should start by approaching the problem from a different perspective. You should be thinking "I want to build X. Now what do I need to learn to build that?" not "I want to learn to build stuff. What can I learn?"
Tim from Crowdcademy recently wrote about the ugly side of programming:
I've also discovered that learning to code can have a big impact on your personality. Coding uses a lot of thinking patterns that I hadn't really used since my math and statistics classes in college, and even back then not in this intensity. As a result I've become more focused, more logical and smarter. But I've also become more detached from everyday life and less fun to hang out with.
The other day I read an article about global warming, and something about it keeps bugging me. My initial reaction was that someone would figure it all out; someone always does. But "someone" doesn't seem to be getting very far this time, and this is a big, important, world-changing problem. So, I thought, why is that "someone" not me?
Last weekend I created Nemesis, a 3D first-person shooter that runs entirely in the browser, built with WebGL via Three.js. I'm really excited that this is possible in the browser, and that I was able to do it with no prior WebGL or Three.js experience in 23 hours for the AngelHack hackathon. This post explains the code so that other people can do the same.
Like most things in this world, the question of whether cloud hosting is for you is not black and white. Since you're reading this it's pretty likely that you've already read The Cloud Is Not for You and the counterpoint at Heroku Isn't for Idiots. Though both pieces are well-written and offer useful (if pointed) insight, what everyone actually wants to know is when to use each kind of hosting.
A lot of people think about programming as some huge, difficult discipline that you sit down and learn like you would learn History or Math. "I think I'll learn how to code today," one might say, "and I've really been looking forward to that quantum physics class."
Here's the thing: almost no one learns how to code in a classroom, by hearing about it or by reading about it. People learn how to code by doing it, like driving a car. But most people learn how to drive a car because it gets them from Point A to Point B, not because driving is fun. Lots of people drive for fun, but hundreds of millions of people slog through traffic on their way to work every day.
I have a problem with mobile-first design.
I spend a lot of time every day sitting in front of three 1920x1080 screens. That's 6,220,800 pixels to play with, and web developers are not using them well. Take Twitter.com, for example: the Tweet column is 496 pixels wide. That's 26% of the width of just one of my screens for all of the content on the site that I'm supposed to read and engage with. When I'm sitting 3 feet away, the text is small, and it's a small target for my mouse (I've sped up the cursor so I can efficiently pan across screens).
Isaac is a student with a passion for technology and entrepreneurship. He also pursues politics and music. But mostly he builds things.