You need multiple things to come together. Some are abstract and general, some are more specific. You need to know the tools you are using, the capabilities of your platforms.
At some lowest level it is about solving problems: knowing how to drill down and understand requirements, and how to split a large problem into sub-problems. In that respect it is not unlike being an architect, an engineer, or a carpenter.
Then it comes to learning how to translate to an intermediate mental representation. Here is where you already think in terms of the typical language paradigms. Someone used to object-oriented design starts seeing objects, factories, inheritance hierarchies. Someone used to actor-based programming (like Erlang) starts seeing actors sending messages to each other. A Haskell programmer starts thinking about type classes (presumably; I am not one).
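As an illustration (a made-up Python sketch, not from the comment above; the names are invented), the same tiny idea can be "seen" through two different paradigms:

```python
# The same "counter" idea through two mental representations.

# An object-oriented thinker sees a stateful object:
class ClickCounter:
    def __init__(self):
        self.n = 0

    def tick(self):
        self.n += 1
        return self.n

# A more functional thinker sees a pure function over values:
def tick(n):
    return n + 1

c = ClickCounter()
print(c.tick(), tick(0))  # both produce 1
```

Neither view is more correct; they are different intermediate representations of the same problem.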
Then it comes down to writing it. Here one needs to know the tools in detail: the specific language, platform, API. Something object-oriented at the higher level could become something in C++, Java, or Python here. This is where Googling helps. Knowing, say, "oh, Python has this cool built-in module that can help me," or "yeah, I'll use a smart pointer here." Well, you have to know about smart pointers first.
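For example (a hypothetical snippet; the comment only gestures at "some cool built-in module"), knowing the standard library means not hand-rolling things like word counting:

```python
from collections import Counter

# Count word frequencies with a built-in instead of writing the loop yourself.
words = "the quick brown fox jumps over the lazy dog the end".split()
counts = Counter(words)
print(counts.most_common(1))  # [('the', 3)]
```

Someone who doesn't know `collections` exists would write ten lines of dictionary bookkeeping instead; neither is wrong, but one of them you only reach by knowing the tool.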
Then there is the boring but important stuff: knowing how to use version control, how to estimate time, how to communicate with teammates, and how not to be stupid about security.
Anyway, a lot of things have to come together to make someone a successful programmer in the general sense. Some acquire these capacities formally by going to college; some acquire them on their own.
The interesting thing is that regardless of how it was acquired, the usual instinct is to discount it when looking back. "Oh, I just write Python; here is a book on syntax, learn it and start programming." It always somehow seems easy to programmers. And I have made that mistake with friends and family members I have tried to show/teach programming.
I've found the hardest thing about teaching people to program is that there is a huge learning curve between the "simple" stuff and making real progress. I've been working on a device that will level that out a bit with an easier intro level.
I've found that interpreted languages (Python, Perl, or even BASIC) can be less difficult to start with because you can start exploring them right away. Compiled languages are harder because of the intermediate build step.
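That immediacy is easy to demonstrate. With Python, for instance, a beginner can type an expression and see the answer with no compile or link step; the same lines work in the interactive interpreter or saved as a script:

```python
# No compile/link step: save this as a file and run it with `python`,
# or just type the expressions into the interactive interpreter.
print(2 ** 10)            # arithmetic works out of the box
print("hello".upper())    # strings have handy methods to poke at
print(sorted([3, 1, 2]))  # built-ins invite experimentation
```

The feedback loop is a single keystroke, which is exactly what makes exploration cheap for a beginner.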
"And I have made that mistake with friends and family members I have tried to show/teach programming."
I'm just curious, knowing what you know now, how would you go about trying to show someone how to program?
I may have that opportunity soon, and I was going to suggest thinking about a simple yet everyday problem that arises and trying to make it better with an app (it could be a phone app, web app, client app, etc.).
I learned programming with HTML/PHP. It's great in that HTML is really easy to understand and lets you draw colorful text and pictures to the screen almost immediately. Then, you can jump to Turing-completeness with a new kind of tag, "<?php", that can handle form inputs and generate new HTML!
And then get them over to Python or something ASAP. PHP doesn't have enough structure for a beginner to learn good software design principles, it's best kept to just teaching a thing or two about loops and control structures.
Teaching HTML + JS is tempting, but then you kinda have to explain the whole DOM too, which just makes things complicated.
1. I try to find out something that the person already understands, so I can frame my metaphors around that.
2. Maybe I'm projecting, but I was tinkering with computers for a long time before I started learning to program. Along the way you absorb a lot of intuitive understanding of software: basic concepts like configuration files, rebooting, memory, what a browser does, what a URL variable is, etc.
There's a lot to be said for helping a person round out this knowledge, before you start teaching them how to program.
I don't know that a person needs to know a lot about America before learning American English, but I wouldn't be surprised to find out that it helps. "You see, they're kind of a casual low key people, so their language has a lot of slang and elisions..."
Learn to question everything.
Use Codecademy, Khan Academy, and w3schools.com.
Don't listen to people who tell you something is a horrible idea. You're far too ignorant of programming to have your own opinion yet, and accepting someone else's will just stunt your progress.
It's like learning to speak another language; you're talking to a child. You have to tell/teach it everything about its life. Eventually it'll be able to live by itself.
When you're ready for application programming, look for topics on C/C++ and other languages that people seem to love. Find out what your favorite game/website is built in and see if you can replicate it.
I'm teaching a relatively mathematically-minded friend programming and I've found http://repl.it/languages/Scheme to be an excellent resource. The simple syntax and semantics of Scheme and its easy accessibility allow them to quickly develop an understanding of the fundamentals, and from there it's not much work for them to download DrRacket and start on more practical applications.
I think games are a great place to start; simple, humorous text-based adventures, for instance, are both really fun to write and teach a lot of the basics.
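A text adventure like that can start out very small. Here is a sketch in Python (room names and layout invented for illustration): the whole world is a dictionary, and the game loop is a few lines.

```python
# A two-room adventure: each room has a description and a dict of exits.
ROOMS = {
    "hall": {
        "desc": "A dusty hall. A door leads north.",
        "exits": {"north": "library"},
    },
    "library": {
        "desc": "Shelves of old books. A door leads south.",
        "exits": {"south": "hall"},
    },
}

def step(room, command):
    """Apply one movement command; return (new_room, message)."""
    exits = ROOMS[room]["exits"]
    if command in exits:
        new_room = exits[command]
        return new_room, ROOMS[new_room]["desc"]
    return room, "You can't go that way."

room = "hall"
for cmd in ["north", "east", "south"]:
    room, message = step(room, cmd)
    print(message)
```

Adding items, humor, and a real input loop on top of this teaches dictionaries, functions, conditionals, and state, all while the learner thinks they're just writing a game.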
I would be more patient. I would start with something they want to do, like a practical project: "make a garage door opener," "make a robot controller," a beer-brewing temperature probe with a web server, or an app to teach Morse code.
The more abstract aspects often become internalized through experience and concrete examples. Say, I didn't care about all the network or socket programming until I had to implement a chat-like server. Looking back I think "oh yeah, algorithms are important," but I only internalized that after I picked the wrong one and got horrible performance. Or after I started interviewing and everyone and their cousin wanted algorithms, so yes, they are important for getting a good job ;-).
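That "picked the wrong one" lesson shows up even in tiny examples. A sketch in Python (sizes chosen arbitrarily): the same membership question asked of a list versus a set gives the same answer at very different cost.

```python
import timeit

# Same question ("is x in the collection?"), two data structures:
# a list scans every element (O(n)); a set hashes straight to it (O(1) avg).
data_list = list(range(100_000))
data_set = set(data_list)

slow = timeit.timeit(lambda: 99_999 in data_list, number=100)
fast = timeit.timeit(lambda: 99_999 in data_set, number=100)
print(f"list membership: {slow:.4f}s, set membership: {fast:.6f}s")
```

Reading "lists are O(n) for membership" in a book is one thing; watching your own server crawl because of it is what makes the lesson stick.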
Anyway, this "example before abstract" is just my personal idea. Googling for examples is something I do very often. Say, when I get to an API section, I quickly look for an example first. I love man pages that have examples.
I've found the hardest thing for beginner programmers is getting to a state where they can actually run their code. That's why I think Node.js and JavaScript are so awesome. The code just runs; better yet, JavaScript (perhaps not Node.js) can run in the browser (there's the developer console, and even JSFiddle)!
What I think is cool about programming is that if you can understand a book, like the C Programming Language for example, then you can probably program. Although, being able to understand that book may take a little more work than just reading it (you might need some supplementary material). A lot of other pieces are soft skills for becoming a professional software engineer, such as time estimation, communication, version control, etc (soft skills in that they can likely be picked up on the job).
Yes, but the coding part is usually the easiest part of being a programmer. All the other skills and knowledge you mention are what make me far more valuable than when I started 14 years ago.
I think there is a huge distinction between learning how to program, and becoming a valuable programmer. We need to decrease the barrier to entry, so that the more valuable skills (such as abstraction, optimization, security, etc.) are reachable for everyone and anyone.