It's Ada Lovelace Day. If that doesn't mean anything to you, then shame on you! She has the notable distinction of being considered the first person ever to write a program for a computer. She was also British and died in 1852. Yes, you read that date correctly.
The Honorable Augusta Ada Byron has a story that will half fascinate, half surprise you. Born in December 1815, she holds the historic position of being the only legitimate child of Lord Byron, the poet, and his wife Annabella. Annabella's bitterness toward her dramatic, wayward husband is said to be why she pushed the young Ada toward the pursuit of math and logic.
As an adult, Lovelace styled herself a poetical scientist, analyst, and metaphysician, and she was acquainted with a few people you may have heard of: Charles Dickens, Michael Faraday, Charles Wheatstone (developer of the famous Wheatstone bridge electrical circuit), and Charles Babbage. It's her working relationship with Babbage that we particularly remember today.
Babbage was an eccentric but brilliant inventor whose obsession with mechanisms led him to design mechanical calculating engines that are considered, by some, the conceptual beginnings of the modern computer (check out the links about his inventions here; they're so beautiful they make modern computers look simply silly). Lovelace's translation of a study of Babbage's "analytical engine," which she heavily annotated with her own insights, included an algorithm she had devised for computing Bernoulli numbers that could be processed by the machine. Essentially, this is the first computer program, though the "code" and the machine itself bear no resemblance to your latest foray into C, D, Ruby, or whatever on your PC.
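For the curious, here is what computing Bernoulli numbers can look like today. This is emphatically not a transcription of Lovelace's Note G table for the Analytical Engine; it's a modern sketch using the standard recurrence B_m = -1/(m+1) · Σ C(m+1, k)·B_k (summing k from 0 to m-1), with exact rational arithmetic so the familiar values come out precisely.

```python
from fractions import Fraction
from math import comb  # binomial coefficients; Python 3.8+


def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions.

    Uses the classical recurrence (with the B_1 = -1/2 convention),
    not Lovelace's original scheme of operations.
    """
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Sum C(m+1, k) * B_k for k = 0 .. m-1, then solve for B_m.
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B[m] = -acc / Fraction(m + 1)
    return B


# Known values: B_2 = 1/6, B_4 = -1/30, B_6 = 1/42; odd indices > 1 are zero.
print(bernoulli(8))
```

A few dozen characters of Python for what took Lovelace a painstaking hand-worked table of engine operations; the gap between the two is a good measure of how far ahead of her hardware she was thinking.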
But it's hard to exaggerate how great a conceptual leap Lovelace made in this and her other theories, including her later ambition to build a mathematical model of the human brain. Her ideas were dreamed up when many of these concepts were as intangible as the highest, strangest flights of science fiction. There's controversy about her contributions to this work, almost inevitably, but here's a thought for you today: could you theorize code that would run on a future computer that won't be designed until 2113?