I didn't intend for Please Don't Learn to Code to be so controversial, but it seemed to strike a nerve. Apparently a significant percentage of readers stopped reading at the title.
So I will open with my own story. I think you'll find it instructive.
My mom once told me that the only reason she dated my father is because her mother told her to stay away from that boy, he's a bad influence.
If she had, I would not exist.
True story, folks.
I'd argue that the people who need to learn to code will be spurred on most of all by honesty, not religious faith in the truthiness of code as a universal good. Go in knowing both sides of the story, because there are no silver bullets in code. If, after hearing both the pros and cons, you still want to learn to code, then by all means learn to code. If you're so easily dissuaded by hearing a few downsides to coding, there are plenty of other things you could spend your time learning that are more unambiguously useful and practical. Per Michael Lopp, you could learn to be a better communicator. Per Gina Trapani, you could learn how to propose better solutions. Slinging code is just a tiny part of the overall solution in my experience. Why optimize for that?
On the earliest computers, everyone had to be a programmer because there was no software. If you wanted the computer to do anything, you wrote code. Computers in the not so distant past booted directly to the friendly blinking cursor of a BASIC interpreter. I view the entire arc of software development as a field where we programmers spend our lives writing code so that our fellow human beings no longer need to write code (or even worse, become programmers) to get things done with computers. So this idea that "everyone must know how to code" is, to me, going backwards.
I fully support a push for basic Internet literacy. But in order to be a competent driver, does everyone need to know, in detail, how their automobile works? Must we teach all human beings the basics of being an auto mechanic, and elevate shop class to the same level as English and Mathematics classes? Isn't knowing how to change a tire, and when to take your car in for an oil change, sufficient? If your toilet is clogged, you shouldn't need to take a two-week, in-depth plumbing course on toiletcademy.com to understand how to fix that. Reading a single web page, just in time, should be more than adequate.
What is code, in the most abstract sense?
code (kōd) …

 A system of signals used to represent letters or numbers in transmitting messages.
 A system of symbols, letters, or words given certain arbitrary meanings, used for transmitting messages requiring secrecy or brevity.
 A system of symbols and rules used to represent instructions to a computer…
— The American Heritage Dictionary of the English Language
Is it punch cards? Remote terminals? Emacs? TextMate? Eclipse? Visual Studio? C? Ruby? JavaScript? In the 1920s, it was considered important to learn how to use slide rules. In the 1960s, it was considered important to learn mechanical drawing. None of that matters today. I'm hesitant to recommend any particular approach to coding other than the fundamentals as outlined in Code: The Hidden Language of Computer Hardware and Software, because I'm not sure we'll even recognize coding in the next 20 or 30 years. To kids today, perhaps coding will eventually resemble Minecraft, or building levels in Portal 2.
But everyone should try writing a little code, because it somehow sharpens the mind, right? Maybe in the same abstract way that reading the entire Encyclopedia Britannica from beginning to end does. Honestly, I'd prefer that people spend their time discovering what problems they love and find interesting, first, and researching the hell out of those problems. The toughest thing in life is not learning a bunch of potentially hypothetically useful stuff, but figuring out what the heck it is you want to do. If said research and exploration leads to coding, then by all means learn to code with my blessing … which is worth exactly what it sounds like, nothing.
So, no, I don't advocate learning to code for the sake of learning to code. What I advocate is shamelessly following your joy. For example, I received the following email yesterday.
I am a 45 year old attorney/C.P.A. attempting to abandon my solo law practice as soon as humanly possible and strike out in search of my next vocation. I am actually paying someone to help me do this and, as a first step in the "find yourself" process, I was told to look back over my long and winding career and identify those times in my professional life when I was doing something I truly enjoyed.
Coming of age as an accountant during the PC revolution (when I started my first "real" job at Arthur Andersen we were still billing clients to update depreciation schedules manually), I spent a lot of time learning how to make computers, printers, and software (VisiCalc anyone?) work. This quasi-technical aspect of my work reached its apex when I was hired as a healthcare financial analyst for a large hospital system. When I arrived for my first day of work in that job, I learned that my predecessor had bequeathed me only a one-page static Excel spreadsheet that purported to "analyze" a multimillion-dollar managed care contract for a seven-hospital health system. I proceeded to build my own spreadsheet but quickly exceeded the database functional capacity of Excel and had to teach myself Access, and thereafter proceeded to stretch the envelope of Access' spreadsheet capabilities to their utmost capacity – I had to retrieve hundreds of thousands of patient records and then perform pro forma calculations on them to see if the proposed contracts would result in more or less payment given identical utilization.
I will be the first to admit that I was not coding in any professional sense of the word. I did manage to make Access do things that MS technical support told me it could not do but I was still simply using very basic commands to bend an existing application to my will. The one thing I do remember was being happy. I typed infinitely nested commands into formula cells for twelve to fourteen hours a day and was still disappointed when I had to stop.
My experience in building that monster and making it run was, to date, my most satisfying professional accomplishment, despite going on to later become CFO of another healthcare facility, a feat that should have fulfilled all of my professional ambitions at that time. More than just the work, however, was the group of like-minded analysts and IT folks with whom I became associated as I tried, failed, tried, debugged, and continued building this behemoth of a database. I learned about Easter Eggs and coding lore and found myself hacking into areas of the hospital mainframe which were completely off-limits to someone of my pay grade. And yet, I kept pursuing my "professional goals" and ended up in jobs/careers I hated doing work I loathed.
Here's a person who a) found an interesting problem, b) attempted to create a solution to the problem, which naturally c) led them to learning to code. And they loved it. This is how it's supposed to work. I didn't become a programmer because someone told me learning to code was important, I became a programmer because I wanted to change the rules of the video games I was playing, and learning to code was the only way to do that. Along the way, I too fell in love.
All that to say that as I stand at the crossroads once more, I still hear the siren song of those halcyon days of quasi-coding during which I enjoyed my work. My question for you is whether you think it is even possible for someone of my vintage to learn to code to a level that I could be hired as a programmer. I am not trying to do this on the side while running the city of New York as a day job. Rather, I sincerely and completely want to become a bona fide programmer and spend my days creating (and/or debugging) something of value.
Unfortunately, calling yourself a "programmer" can be a career-limiting move, particularly for someone who was a CFO in a previous career. People who work with money tend to make a lot of money; see Wall Street.
But this isn't about money, is it? It's about love. So, if you want to be a programmer, all you need to do is follow your joy and fall in love with code. Any programmer worth their salt immediately recognizes a fellow true believer, a person as madly in love with code as they are, warts and all. Welcome to the tribe.
And if you're reading this and thinking, "screw this Jeff Atwood guy, who is he to tell me whether I should learn to code or not", all I can say is: good! That's the spirit!
Dynamic programming is both a mathematical optimization method and a computer programming method. In both contexts it refers to simplifying a complicated problem by breaking it down into simpler subproblems in a recursive manner. While some decision problems cannot be taken apart this way, decisions that span several points in time do often break apart recursively; Bellman called this the "Principle of Optimality". Likewise, in computer science, a problem that can be broken down recursively is said to have optimal substructure.
If subproblems can be nested recursively inside larger problems, so that dynamic programming methods are applicable, then there is a relation between the value of the larger problem and the values of the subproblems.^{[5]} In the optimization literature this relationship is called the Bellman equation.
Dynamic programming in mathematical optimization
In terms of mathematical optimization, dynamic programming usually refers to simplifying a decision by breaking it down into a sequence of decision steps over time. This is done by defining a sequence of value functions V_{1}, V_{2}, ..., V_{n}, with an argument y representing the state of the system at times i from 1 to n. The definition of V_{n}(y) is the value obtained in state y at the last time n. The values V_{i} at earlier times i = n −1, n − 2, ..., 2, 1 can be found by working backwards, using a recursive relationship called the Bellman equation. For i = 2, ..., n, V_{i−1} at any state y is calculated from V_{i} by maximizing a simple function (usually the sum) of the gain from decision i − 1 and the function V_{i} at the new state of the system if this decision is made. Since V_{i} has already been calculated for the needed states, the above operation yields V_{i−1} for those states. Finally, V_{1} at the initial state of the system is the value of the optimal solution. The optimal values of the decision variables can be recovered, one by one, by tracking back the calculations already performed.
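The backward recursion just described can be sketched in Python. Everything concrete here (the state set, action set, and the gain and transition functions) is an illustrative assumption, not something specified above:

```python
def backward_induction(states, actions, gain, transition, horizon):
    """Compute value functions V_n, ..., V_1 by working backwards.

    gain(i, y, a)    -- reward for taking action a in state y at time i
    transition(y, a) -- next state reached by taking action a in state y
    """
    # Terminal boundary: nothing is gained after the last time step.
    V = {horizon + 1: {y: 0.0 for y in states}}
    policy = {}
    for i in range(horizon, 0, -1):
        V[i] = {}
        policy[i] = {}
        for y in states:
            # Bellman equation: V_i(y) = max_a [ gain(i, y, a) + V_{i+1}(new state) ]
            best_a = max(actions,
                         key=lambda a: gain(i, y, a) + V[i + 1][transition(y, a)])
            V[i][y] = gain(i, y, best_a) + V[i + 1][transition(y, best_a)]
            policy[i][y] = best_a
    return V, policy
```

Working backwards from the horizon guarantees that V_{i+1} is already available whenever V_i is computed, and the stored policy lets the optimal decisions be read off forwards afterwards.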
Dynamic programming in computer programming
There are two key attributes that a problem must have in order for dynamic programming to be applicable: optimal substructure and overlapping subproblems. However, when the overlapping problems are much smaller than the original problem, the strategy is called "divide and conquer" rather than "dynamic programming". This is why mergesort, quicksort, and finding all matches of a regular expression are not classified as dynamic programming problems.
Optimal substructure means that the solution to a given optimization problem can be obtained by the combination of optimal solutions to its subproblems. Consequently, the first step towards devising a dynamic programming solution is to check whether the problem exhibits such optimal substructure. Such optimal substructures are usually described by means of recursion. For example, given a graph G=(V,E), the shortest path p from a vertex u to a vertex v exhibits optimal substructure: take any intermediate vertex w on this shortest path p. If p is truly the shortest path, then the path p_{1} from u to w and p_{2} from w to v are indeed the shortest paths between the corresponding vertices (by the simple cut-and-paste argument described in CLRS). Hence, one can easily formulate the solution for finding shortest paths in a recursive manner, which is what the Bellman–Ford algorithm or the Floyd–Warshall algorithm does.
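A minimal Bellman–Ford sketch shows how this substructure is exploited; the edge-list graph representation is an assumption made for brevity:

```python
def bellman_ford(vertices, edges, source):
    """Shortest distances from source. edges is a list of (u, v, weight)."""
    INF = float('inf')
    dist = {v: INF for v in vertices}
    dist[source] = 0
    # Relax every edge |V| - 1 times. After pass k, dist[v] is optimal over
    # all paths using at most k edges: each pass extends already-optimal
    # sub-paths (the p_1 from u to w above) by one more edge.
    for _ in range(len(vertices) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist
```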
Overlapping subproblems means that the space of subproblems must be small, that is, any recursive algorithm solving the problem should solve the same subproblems over and over, rather than generating new subproblems. For example, consider the recursive formulation for generating the Fibonacci series: F_{i} = F_{i−1} + F_{i−2}, with base case F_{1} = F_{2} = 1. Then F_{43} = F_{42} + F_{41}, and F_{42} = F_{41} + F_{40}. Now F_{41} is being solved in the recursive subtrees of both F_{43} as well as F_{42}. Even though the total number of subproblems is actually small (only 43 of them), we end up solving the same problems over and over if we adopt a naive recursive solution such as this. Dynamic programming takes account of this fact and solves each subproblem only once. Note that the subproblems must be only slightly smaller (typically taken to mean a constant additive factor) than the larger problem; when they are a multiplicative factor smaller the problem is no longer classified as dynamic programming.
Figure 2. The subproblem graph for the Fibonacci sequence. The fact that it is not a tree indicates overlapping subproblems.
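The overlap is easy to observe by instrumenting the naive recursion; the call-counting dictionary is an illustrative addition, not part of the algorithm:

```python
def fib_naive(i, calls):
    """Naive recursive Fibonacci, recording how often each F_i is computed."""
    calls[i] = calls.get(i, 0) + 1
    if i <= 2:                      # base case: F_1 = F_2 = 1
        return 1
    return fib_naive(i - 1, calls) + fib_naive(i - 2, calls)
```

Running this for even a modest argument shows the same small subproblems being recomputed many times, which is exactly the waste dynamic programming removes.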
This can be achieved in either of two ways:
 Top-down approach: This is the direct fallout of the recursive formulation of any problem. If the solution to any problem can be formulated recursively using the solution to its subproblems, and if its subproblems are overlapping, then one can easily memoize or store the solutions to the subproblems in a table. Whenever we attempt to solve a new subproblem, we first check the table to see if it is already solved. If a solution has been recorded, we can use it directly, otherwise we solve the subproblem and add its solution to the table.
 Bottom-up approach: Once we formulate the solution to a problem recursively in terms of its subproblems, we can try reformulating the problem in a bottom-up fashion: try solving the subproblems first and use their solutions to build on and arrive at solutions to bigger subproblems. This is also usually done in tabular form, by iteratively generating solutions to bigger and bigger subproblems by using the solutions to small subproblems. For example, if we already know the values of F_{41} and F_{40}, we can directly calculate the value of F_{42}.
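Both approaches can be sketched for the Fibonacci example used above:

```python
def fib_memo(n, table=None):
    """Top-down: recurse, but store each solved subproblem in a table."""
    if table is None:
        table = {1: 1, 2: 1}        # base cases F_1 = F_2 = 1
    if n not in table:
        table[n] = fib_memo(n - 1, table) + fib_memo(n - 2, table)
    return table[n]

def fib_bottom_up(n):
    """Bottom-up: solve the smallest subproblems first and build upward."""
    f = {1: 1, 2: 1}
    for i in range(3, n + 1):
        f[i] = f[i - 1] + f[i - 2]  # e.g. F_42 from the stored F_41 and F_40
    return f[n]
```

Either way, each of the 43 subproblems of F_{43} is solved exactly once.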
Some programming languages can automatically memoize the result of a function call with a particular set of arguments, in order to speed up call-by-name evaluation (this mechanism is referred to as call-by-need). Some languages make it possible portably (e.g. Scheme, Common Lisp or Perl), some need special extensions (e.g. C++, see^{[6]}). Some languages have automatic memoization built in, such as tabled Prolog and J, which supports memoization with the M. adverb^{[7]}. In any case, this is only possible for a referentially transparent function.
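Python is not among the languages named above, but its standard library offers the same facility as a decorator; as noted, this is only safe because the function is referentially transparent:

```python
import functools

@functools.lru_cache(maxsize=None)   # automatic memoization keyed on arguments
def fib(n):
    return 1 if n <= 2 else fib(n - 1) + fib(n - 2)
```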
Algorithms that use dynamic programming
 Recurrent solutions to lattice models for protein-DNA binding
 Backward induction as a solution method for finite-horizon, discrete-time dynamic optimization problems
 Method of undetermined coefficients can be used to solve the Bellman equation in infinite-horizon, discrete-time, discounted, time-invariant dynamic optimization problems
 Many string algorithms including longest common subsequence, longest increasing subsequence, longest common substring, Levenshtein distance (edit distance).
 Many algorithmic problems on graphs can be solved efficiently for graphs of bounded treewidth or bounded cliquewidth by using dynamic programming on a tree decomposition of the graph.
 The Cocke–Younger–Kasami (CYK) algorithm, which determines whether and how a given string can be generated by a given context-free grammar
 Knuth's word wrapping algorithm that minimizes raggedness when word wrapping text
 The use of transposition tables and refutation tables in computer chess
 The Viterbi algorithm (used for hidden Markov models)
 The Earley algorithm (a type of chart parser)
 The Needleman–Wunsch and other algorithms used in bioinformatics, including sequence alignment, structural alignment, RNA structure prediction.
 Floyd's all-pairs shortest path algorithm
 Optimizing the order for chain matrix multiplication
 Pseudo-polynomial time algorithms for the Subset Sum, Knapsack, and Partition problems
 The dynamic time warping algorithm for computing the global distance between two time series
 The Selinger (a.k.a. System R) algorithm for relational database query optimization
 De Boor algorithm for evaluating B-spline curves
 Duckworth–Lewis method for resolving the problem when games of cricket are interrupted
 The Value Iteration method for solving Markov decision processes
 Some graphic image edge following selection methods such as the "magnet" selection tool in Photoshop
 Some methods for solving interval scheduling problems
 Some methods for solving word wrap problems
 Some methods for solving the travelling salesman problem, either exactly (in exponential time) or approximately (e.g. via the bitonic tour)
 Recursive least squares method
 Beat tracking in Music Information Retrieval.
 Adaptive Critic training strategy for artificial neural networks
 Stereo algorithms for solving the Correspondence problem used in stereo vision.
 Seam carving (content aware image resizing)
 The Bellman–Ford algorithm for finding the shortest distance in a graph.
 Some approximate solution methods for the linear search problem.
 Kadane's algorithm for the Maximum subarray problem.
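Taking the last entry as an example, Kadane's algorithm is a one-dimensional dynamic program: the subproblem is the best subarray ending at each position, and each such subproblem is built from the one before it.

```python
def max_subarray(xs):
    """Maximum sum over all non-empty contiguous subarrays of xs."""
    best = cur = xs[0]
    for x in xs[1:]:
        cur = max(x, cur + x)   # best subarray ending at this element:
                                # either extend the previous one or start fresh
        best = max(best, cur)
    return best
```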