London, June 1833. A dinner party at the home of Charles Babbage, the eccentric mathematician and inventor who had made his house on Dorset Street a gathering point for the most curious minds in England. Scientists, politicians, artists, and socialites mingled beneath crystal chandeliers, admiring the strange mechanical device that occupied one end of the room — a fragment of Babbage’s great unfinished Difference Engine, a tower of brass gears and levers that could, its creator claimed, calculate mathematical tables automatically without human error.

Most of the guests admired the machine the way one admires a clever conjuring trick — with pleasure, and without deep comprehension. They saw gears. They saw ingenuity. They saw a novelty.

One guest saw something else entirely.

She was seventeen years old. Her name was Augusta Ada Byron — daughter of the most famous poet in England, a man she had never met. She stood before the fragment of the Difference Engine and understood, with a clarity that would take the rest of the world a century to catch up to, exactly what it meant and exactly what it could become.

Her name, after her marriage, would be Ada Lovelace. And she would write the world’s first computer program for a machine that, to this day, has never been built.


The Daughter of a Dangerous Man

To understand Ada Lovelace, you first have to understand the shadow she was born into — and the lengths to which the people who loved her went to keep that shadow from falling on her.

Her father was George Gordon Byron, the sixth Baron Byron — the most celebrated and the most scandalous poet of the Romantic age. Byron was a man of extraordinary gifts and catastrophic personal failings. He wrote with a beauty and emotional intensity that made him the rock star of his era. He was also spectacularly self-destructive: he drank, gambled, conducted affairs of staggering recklessness (including, according to persistent rumor and some evidence, an incestuous relationship with his half-sister Augusta), and accumulated debts that threatened to consume everything he owned.

In January 1815, Byron married Annabella Milbanke, a serious, intelligent woman from a respectable family who believed — with a confidence that retrospect reveals as tragic — that she could save him. She could not. The marriage was a disaster from the beginning. Byron was by turns charming and cruel, tender and violent, present and catastrophically absent. On December 10, 1815, Annabella gave birth to their daughter — Augusta Ada Byron, named partly after Byron’s half-sister, a choice that would later seem bitterly ironic. Five weeks later, Annabella took the baby and left. She never went back.

Byron signed the separation papers and left England in April 1816, never to return. He died in Greece in 1824, at thirty-six, having gone there to support the Greek war of independence — a romantic gesture that ended with a fever rather than a heroic death in battle. Ada was eight years old. She would carry his absence like a second shadow for the rest of her life.

Annabella — Lady Byron, as she now was — responded to the catastrophe of her marriage with a fierce and determined plan for her daughter’s future. She was convinced that Byron’s failings were the product of his poetic, romantic, irrational nature. She was determined that Ada would not inherit them. And so she arranged for Ada to receive an education that was, for a girl of the 1820s, almost without parallel.

Ada was taught mathematics. Serious mathematics — not the decorative arithmetic considered suitable for young ladies of good family, but algebra, geometry, and eventually calculus. She was taught science. She was taught logic and rigorous analytical thinking. Annabella surrounded her with tutors who were themselves serious scholars. When Ada showed a gift for music, Annabella allowed it — but always paired it with mathematical study, as if the two disciplines might balance each other, reason keeping imagination in check.

What Annabella could not have predicted — what perhaps nobody could have predicted — was that her plan would produce not a safely rational young woman but a mind of extraordinary and unusual power: mathematical and imaginative simultaneously, rigorous and visionary at once. Ada would be, in the end, very much her father’s daughter. But the thing she would imagine was not a poem. It was the future of computing.


A Mind Taking Shape

Ada’s childhood was not easy. She was often ill — a pattern that would continue throughout her short life — and the illnesses were sometimes severe. At thirteen she developed measles so serious that it left her temporarily unable to walk. She spent nearly three years partially or wholly bedridden, and when she finally recovered full mobility it was with a determination and energy that suggested she had been storing them up through the long months of enforced stillness.

During her illness, confined to bed, Ada threw herself into mathematics with an intensity that alarmed some of her tutors and delighted others. She worked through problems in geometry and algebra not because she was told to but because she found them genuinely absorbing. She wrote to her mother about mathematical ideas with the same excitement another child might write about a game or a story. The numbers were not a chore. They were a world.

She also showed, from an early age, a quality that would define her contribution to computer science: the ability to think about abstractions concretely. She could hold an idea in her mind at a high level of generality and simultaneously see its specific implications and applications. This is a rare cognitive gift. Most people are good at one or the other — at the abstract or the concrete — but struggle to move fluidly between the two levels. Ada could do both, almost simultaneously.

This gift manifested in a remarkable project she undertook at age twelve. Ada decided she wanted to fly. Not metaphorically — literally. She began a systematic investigation of the problem of human flight, studying bird anatomy, the properties of different materials, and the physics of wings and lift. She filled notebooks with observations and calculations. She considered what materials wings might be made of — feathers, paper, silk. She thought about the proportions required. She called the project “Flyology.”

The project never got off the ground, literally or figuratively. But it reveals something important about how Ada’s mind worked. She did not see a problem — the problem of human flight — as inherently beyond human solution. She saw it as an engineering challenge: identify the principles, gather the relevant knowledge, design a systematic approach. This is the scientist’s and the engineer’s mindset. And in a twelve-year-old girl in 1828, three-quarters of a century before powered human flight was achieved, it was remarkable.

She never lost this quality. And when she encountered Charles Babbage’s Analytical Engine, she applied exactly the same approach to a far more consequential problem: figuring out what a computing machine could actually do.


Enter Babbage

The dinner party of June 1833 was not Ada’s first encounter with the world of science and intellectual ambition — Annabella had been careful to expose her to that world — but it was the most consequential. Charles Babbage was already famous, in the way that brilliant, controversial, perpetually frustrated men of science become famous: respected by those who understood what he was trying to do, dismissed by those who found him impractical, and funded insufficiently by everyone.

The fragment of the Difference Engine he displayed at his soirées was a demonstration piece — a small section of the full machine, which was never completed. But even the fragment was enough to convey the principle: that arithmetic calculation could be mechanized, that the grinding, error-prone work of computing mathematical tables by human hand could be replaced by a machine that would do it automatically and perfectly.

Most of Babbage’s guests admired the fragment and moved on. Ada Lovelace asked to come back.

This was not a small thing for a young woman of her era. Intellectual life in early Victorian England was heavily gendered. Women could attend soirées, could be educated, could even publish — but engaging seriously, as intellectual equals, with the leading scientists and mathematicians of the day was not something young women of good family typically did. Ada did it anyway, with her mother’s somewhat wary support.

She and Babbage formed a friendship that was one of the most productive and unlikely partnerships in the history of science. Babbage was more than twenty years her senior, irascible, brilliant, perpetually distracted, and constitutionally unable to finish anything he started. Ada was young, often unwell, intellectually voracious, and possessed of a capacity for sustained, focused analytical work that Babbage himself lacked. They irritated each other regularly. They needed each other completely.

As their correspondence deepened over the following years, Ada became increasingly knowledgeable about the Difference Engine and then — as Babbage’s thinking evolved — about his far more ambitious new conception: the Analytical Engine.

The Analytical Engine was a machine of a completely different order from the Difference Engine. Where the Difference Engine was a specialized calculator — powerful, but capable of only a specific class of mathematical operations — the Analytical Engine was a general-purpose computing machine. It had a store, equivalent to memory, where numbers could be held. It had a mill, equivalent to a processor, where arithmetic operations were performed. It could be programmed using punched cards borrowed from the Jacquard weaving loom. And crucially, it could perform conditional operations — it could branch, meaning it could do different things depending on the results of previous calculations.

This is the architecture of a computer. Not an electronic computer — Babbage’s machine was mechanical, driven by steam — but a computer in the logical and functional sense. Every computer you have ever used, from the earliest mainframes to the phone in your pocket, is built on the same fundamental logical architecture that Babbage conceived for the Analytical Engine.
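The store/mill/branch scheme is easiest to see in motion. Here is a toy interpreter for an Analytical-Engine-like design — a modern sketch, not Babbage’s notation; the opcode names and instruction layout are invented for illustration:

```python
def run(program, store):
    """A toy machine in the Analytical Engine's image: 'store' is the
    memory (a list of numbered cells), the arithmetic ops play the role
    of the mill, and 'jump_if_zero' is the conditional branch that lets
    the machine do different things depending on earlier results."""
    pc = 0  # which instruction card we are on
    while pc < len(program):
        op, *args = program[pc]
        if op == "add":                    # mill: store[c] = store[a] + store[b]
            a, b, c = args
            store[c] = store[a] + store[b]
            pc += 1
        elif op == "sub":                  # mill: store[c] = store[a] - store[b]
            a, b, c = args
            store[c] = store[a] - store[b]
            pc += 1
        elif op == "jump_if_zero":         # branch on the result of earlier work
            cell, target = args
            pc = target if store[cell] == 0 else pc + 1
        elif op == "jump":                 # unconditional jump (loop back)
            (target,) = args
            pc = target
        else:
            raise ValueError(f"unknown op: {op}")
    return store

# Multiply store[0] by store[1] through repeated addition: a loop that
# branches out when the counter in cell 1 reaches zero.
# Cells: [multiplicand, counter, accumulator, the constant 1]
result = run(
    [("jump_if_zero", 1, 4),  # counter exhausted? jump past the end
     ("add", 2, 0, 2),        # accumulator += multiplicand
     ("sub", 1, 3, 1),        # counter -= 1
     ("jump", 0)],            # back to the top
    [4, 3, 0, 1],
)
# result[2] is now 12
```

The loop and the conditional exit are the whole point: with just a store, a mill, and a branch, the machine is no longer a calculator executing a fixed sequence but a programmable device.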

Ada understood this immediately. More than that, she understood its implications — understood, more clearly than Babbage himself, what it meant that a machine could be programmed to perform any computation whatsoever.


Turin, Translation, and the Notes That Changed Everything

The pivotal moment in Ada Lovelace’s contribution to computing came in 1842, when the Italian mathematician Luigi Federico Menabrea published a paper in French describing the Analytical Engine, based on a series of lectures Babbage had given in Turin in 1840. Babbage’s colleague Charles Wheatstone suggested that Ada translate the paper into English.

This was, on the surface, a relatively modest scholarly task. Ada was fluent in French and mathematically sophisticated enough to understand the content. She was the obvious choice.

But Ada did not simply translate. She annotated.

The notes she added to Menabrea’s paper were nearly three times as long as the paper itself. They addressed aspects of the Analytical Engine’s operation that Menabrea had not covered, clarified points of confusion, explored implications that neither Menabrea nor Babbage had fully worked out, and — in the final and most extraordinary section, labeled Note G — contained what is now recognized as the world’s first computer program.

The program was an algorithm for calculating Bernoulli numbers using the Analytical Engine. Bernoulli numbers are a sequence of rational numbers with deep connections to number theory, and computing them is a genuinely complex mathematical task. Ada’s Note G laid out, in precise detail, the exact sequence of operations the Analytical Engine would need to perform to calculate them: which numbers would be stored where, which arithmetic operations would be performed in which order, how the machine would loop through repeated calculations, how it would handle the conditional branches required when different cases arose.
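Ada laid the computation out as a numbered table of engine operations, which does not map line-for-line onto modern code. But the kind of recurrence her program exploited — each Bernoulli number fixed entirely by the ones already computed, via a loop of arithmetic steps — can be sketched in a few lines. This is a modern reconstruction using the standard Bernoulli recurrence, not Ada’s exact operation sequence:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n+1 Bernoulli numbers (B_1 = -1/2 convention), computed from
    the recurrence  sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m >= 1:
    each new number is determined by all of its predecessors."""
    B = [Fraction(1)]                                    # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))                         # solve for B_m
    return B

# bernoulli(8) -> [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30]
```

The loop, the working variables, and the reuse of earlier results are exactly the structures Ada’s table had to express with punched cards and a repeated block of operations.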

This was not a description of what the machine could do in general terms. This was a working program — a precise, detailed, executable sequence of instructions for a specific computational task. And it was written for a machine that had not been built, whose details Babbage himself had never fully settled, and which, in fact, has never been built; the closest anyone has come is the working Difference Engine No. 2 that the Science Museum in London completed in the 1990s (a related but far simpler design).

The intellectual achievement is staggering. Ada was not programming a machine she could run and test. She was reasoning, purely in the abstract, about the behavior of a machine that existed only on paper — working out what it would do, step by step, under conditions she could only imagine. She was doing, in essence, what modern computer scientists call formal verification: reasoning rigorously about the correctness of a program without being able to execute it.

But the Notes contained more than a program. They contained ideas.


The Vision No One Else Had

Scattered through Ada’s notes are observations and arguments that were so far ahead of their time that they would not be fully appreciated until the computer age was well underway.

The most remarkable is her extended reflection on what the Analytical Engine could and could not do — and by implication, what any computing machine could and could not do.

Ada was aware that people would be tempted to over-claim for the Analytical Engine. Babbage himself had a tendency toward grandiosity about his inventions, and the popular press was even worse. Ada wanted to set the record straight. The Analytical Engine, she insisted, had no power of originating anything. It could only do what we know how to order it to perform.

This statement — careful, precise, and absolutely correct for the technology of her time — would become the center of one of the great debates in AI history. A hundred years later, Alan Turing would use it as a starting point for his own investigation of machine intelligence, calling it “Lady Lovelace’s Objection” and taking it seriously enough to spend several pages of his landmark 1950 paper arguing against it.

Was Lovelace right? Can a computing machine only do what it is explicitly programmed to do? Or can it, through processes complex enough and emergent enough, produce outputs that genuinely surprise its creators — outputs that could meaningfully be called original, creative, intelligent?

This question — which Ada identified and articulated in 1843 — is still genuinely open. When a large language model writes a poem that moves a reader to tears, or proposes a scientific hypothesis that a researcher finds genuinely novel, is it originating something? Or is it executing, at extraordinary scale and complexity, instructions that were ultimately given by its human designers and trainers?

There is no consensus answer. The most honest thing anyone can say is that Ada Lovelace asked the right question — the question that still sits at the center of debates about the nature and limits of machine intelligence — a hundred and eighty years ago.

But Ada did not stop there. In a passage that is almost vertiginously far-sighted, she suggested that the Analytical Engine’s operations on symbols need not be limited to numbers. The machine operated, at its core, on symbols according to rules. Those symbols happened to be numbers. But there was no reason, in principle, why they needed to be. If other things — musical notes, logical propositions, words — could be expressed as symbols subject to formal rules, then the Engine could in principle operate on them too.

This is the concept of general-purpose symbolic computation. And it is the concept on which all of modern computing — and all of modern AI — is built. Every application your computer runs, every website you visit, every song you stream, every conversation you have with an AI assistant, is ultimately reducible to symbolic manipulation according to formal rules. Ada Lovelace saw this in 1843, when the only computing machines in existence were mechanical, when the word “computer” referred to a person who did calculations by hand, when electricity had not yet been harnessed for communication let alone computation.

She also — and this is perhaps the most poignant passage in her notes — imagined a machine that could compose music. If the relationships between musical notes and chords could be expressed formally, she wrote, then the Analytical Engine could be used to compose elaborate pieces of music of any degree of complexity. This is not a casual aside. This is a genuine conceptual leap: the idea that a computing machine could be creative in a domain — music — that was considered quintessentially human.

AI music composition is now a thriving area of research and commercial application. Ada Lovelace imagined it in 1843.


The Mind Behind the Vision

Who was Ada Lovelace as a person? History has tended to flatten her into a symbol — either the Romantic heroine, daughter of Byron, living fast and dying young in a blaze of poetic genius, or the austere proto-computer-scientist, a woman of pure reason inserted into the narrative of technology to correct its gender imbalance. Neither picture is quite right.

The real Ada was complicated, contradictory, and very human.

She was genuinely, seriously mathematically gifted. This is not a question of feminist revision of the historical record. The Notes themselves are the evidence. They are technically sophisticated, analytically rigorous, and conceptually original. Her correspondence with Babbage and with her mathematical tutors — including Augustus De Morgan, one of the leading mathematicians of the age — shows a mind that engaged with difficult material deeply and thoughtfully. De Morgan wrote to Lady Byron that Ada’s mathematical ability was of a quality rarely seen, and that if she had been a man, she would have been capable of becoming an original mathematical investigator.

But she was also impulsive, emotional, and prone to grand schemes that sometimes outran her ability to execute them. She developed a passionate interest in horse racing and attempted, with the help of a group of associates, to develop a mathematical system for betting on races. The project failed spectacularly, leaving her in serious debt that she concealed from her husband for as long as she could. She appears to have had affairs, though the evidence is fragmentary and contested. She was frequently unwell — racked by a series of ailments that in retrospect suggest a constitution that was never robust — and she used laudanum, the opium-based tincture that was the standard medical treatment for pain in her era, in quantities that almost certainly complicated her thinking and her health.

She was, in other words, fully human: brilliant and flawed, visionary and impractical, capable of extraordinary insight and capable too of ordinary human foolishness. The hagiographic versions of her story — the saint of computing, the prophet without fault — do her a disservice. The real Ada is more interesting.

Her relationship with her mother remained difficult and complex throughout her life. Annabella was controlling, anxious, and convinced that any sign of emotion or passion in Ada was the dangerous Byron inheritance asserting itself. She surrounded Ada with advisors and confidants — people she trusted, or believed she could trust — who reported back to her on Ada’s behavior and mental state. Ada both loved her mother and found her suffocating. The tension between them never fully resolved.

Her relationship with her husband, William King, later the Earl of Lovelace, was warmer and more straightforward. He was a decent, steady man who was genuinely fond of Ada, supportive of her intellectual work, and tolerant of her eccentricities. They had three children together. The marriage was, by the standards of the era and Ada’s own complicated nature, a relatively successful one.

And her relationship with Babbage was the intellectual partnership of her life. They argued constantly — about credit, about publication, about how to present the Analytical Engine to the public. In the period before the Notes were published, Babbage asked Ada to add a note of her own stating that the translation and notes had been completed at his request, implying that the work was done under his direction. Ada refused, firmly and in writing. The notes were hers. The ideas were hers. She was not going to present herself as an assistant to a project she had in crucial ways led.

This refusal matters. It is easy, from a distance of nearly two centuries, to frame Ada Lovelace’s story as one of a woman doing remarkable work in spite of the limitations of her era. And that is true. But it is also the story of a woman who knew her own worth — who insisted on her own intellectual independence even in a world that constantly pushed her toward deference — and who was right to do so.


The Erasure

The Notes were published in 1843 in an English scientific journal, attributed simply to “A.A.L.” — Ada’s initials. Whether this was to preserve her anonymity or simply convention is not entirely clear. But the effect was that one of the most important documents in the history of computing was published without a clear author.

It did not matter much in the short term, because nothing came of the Analytical Engine. Babbage never built it. He spent the remaining twenty-eight years of his life working on it, refining the design, arguing with potential funders, failing to secure the sustained support that would have been required. He died in 1871 with the Engine still unbuilt, his life’s great work unrealized.

Without a functioning machine, Ada’s notes had nowhere to go. They were a program without a computer — a precise, sophisticated algorithm for a device that existed only on paper. The ideas they contained were too far ahead of any practical technology to find an audience. They were read and admired by a small number of people who understood their significance, and then they were, for most practical purposes, forgotten.

Ada herself died in 1852, aged thirty-six — the same age her father had been when he died, a coincidence that she had apparently predicted. She had been suffering from uterine cancer, and the treatment — which included bloodletting and increasing doses of laudanum and other opiates — weakened her steadily through her final months. She died in November, with her mother at her side, having asked in her final weeks to be buried next to her father in the Byron family vault in Nottinghamshire.

For the next hundred years, she was a historical footnote. She appeared in biographies of Babbage as a supporting character. She was remembered, if at all, primarily as Byron’s daughter — a curiosity, the Romantic poet’s mathematical child. The Notes were occasionally referenced by historians of mathematics, but their significance was not widely understood.

Then, in 1953, a man named Bertram Vivian Bowden published a book called Faster Than Thought — an overview of the newly emerging field of electronic computing. In it, he reprinted Ada Lovelace’s Notes in full, drawing attention to them as a foundational document in the history of computing. The timing was perfect. The first electronic computers had been built less than a decade earlier. The field was young enough that its history was still being written, and old enough that people were beginning to want to write it.

Ada Lovelace was rediscovered.


The Recognition — And the Debate

The rediscovery of Ada Lovelace in the 1950s and subsequent decades produced something unexpected: a fierce argument about what, exactly, she had done and how much credit she deserved for it.

On one side were those who argued that she was the true visionary of the Analytical Engine project — that her conceptual grasp of the machine’s implications surpassed Babbage’s own, that the Notes were not just a translation with annotations but an original intellectual work of the highest order, and that she deserved recognition as the world’s first computer programmer.

On the other side were those who argued that Babbage deserved more credit than he typically received — that Ada’s contribution, while real, had been somewhat inflated by a narrative hungry for a female pioneer, and that Babbage had worked out most of the key ideas himself.

The argument generated considerable heat and, eventually, some light. The honest assessment, based on careful historical scholarship, is something like this: Babbage conceived the Analytical Engine and its fundamental architecture. Ada understood that architecture with exceptional clarity and communicated it more lucidly than Babbage ever did. The algorithm in Note G — the program for computing Bernoulli numbers — was primarily Ada’s work, though she and Babbage corresponded extensively about it and he may have contributed to some sections. The broader conceptual insights in the Notes — the vision of general-purpose symbolic computation, the suggestion that the Engine could work with music and other symbolic systems, the careful discussion of what machines can and cannot do — are Ada’s.

That is enough. It is more than enough. She is, by any reasonable accounting, one of the most important figures in the history of computing — not because she is a convenient symbol, but because the ideas in her Notes were genuinely original, genuinely ahead of their time, and genuinely foundational to what came after.

The argument also missed something important: the significance of Ada Lovelace’s story is not just what she did, but when she did it. Writing a computer program in 2024 is a skill possessed by hundreds of millions of people. Writing a computer program in 1843, for a machine that did not yet exist, using only abstract reasoning and mathematical imagination — that is something else entirely. It is an act of intellectual imagination so extraordinary that it deserves recognition on its own terms, independent of any argument about gender or historical narrative.


The Legacy That Grew

In 1980, the United States Department of Defense named a new programming language “Ada” in her honor. The language was designed for safety-critical systems — software that controls aircraft, medical equipment, nuclear facilities, systems where a bug can kill people. It was a fitting tribute: a programming language for systems where precision, rigor, and the elimination of error are paramount, named after the woman who brought those qualities to the first computer program ever written.

The choice was not purely symbolic. Ada (the language) is still in use today in exactly those domains — aviation, military systems, rail transport — where reliability is non-negotiable. The name on the language is a constant quiet reminder of where the field of programming began.

In 2009, a blogger named Suw Charman-Anderson proposed a day dedicated to celebrating the achievements of women in science, technology, engineering, and mathematics. She called it Ada Lovelace Day. First held in March 2009 and later fixed on the second Tuesday of October, it has been observed every year since, with events at universities, technology companies, and schools around the world.

Lovelace’s image appears on the hologram in the British Geological Survey’s staff identity cards. There are buildings named after her at universities across the English-speaking world. There are statues, portraits, and murals. There are plays, novels, graphic novels, and a substantial Wikipedia article that is among the most frequently visited pages on women in the history of science.

The recognition is, at this point, thoroughly established. But there is a risk, with any figure who becomes so thoroughly celebrated, that the person disappears behind the symbol. Ada Lovelace the icon — the Countess of Computing, the Prophet of the Digital Age — risks replacing Ada Lovelace the person: the young woman who was seriously ill more often than she was well, who gambled and lost, who argued with her collaborators and refused to be credited as less than she was, who died at thirty-six having written one document of enduring genius and spent the rest of her brief life trying to live up to her own abilities in a world that had very little space for what she was.


What Ada Lovelace Means for AI

Ada Lovelace died more than a century before the field of Artificial Intelligence was formally established. She never used the term — it didn’t exist. She never saw an electronic computer. She never saw a programming language, a software interface, or a database. She would not have recognized any of the specific technologies that constitute modern AI.

And yet she is essential to the story of AI — not just as a historical footnote, not just as a symbol of women’s contributions to computing, but as someone whose ideas remain directly relevant to the deepest questions in the field.

Her argument that the Analytical Engine could only do what we know how to order it to perform is still the central challenge of AI development. Every AI system in existence today, no matter how sophisticated, is ultimately executing something that can be traced back to human decisions: decisions about architecture, about training data, about objective functions, about what counts as success. The question of whether any system can transcend those initial conditions — whether a machine trained by humans can genuinely originate something no human intended — is Ada Lovelace’s question, still unanswered.

Her insight that symbolic manipulation is domain-independent — that a machine operating on symbols according to rules is not limited to numbers — is the conceptual foundation of general-purpose computing and, by extension, of AI. Modern AI systems do not compute numbers. They manipulate representations: vectors, tokens, weights. But the fundamental principle — symbols, rules, transformation — is the one Ada identified.

Her imagination of a machine that could compose music anticipated a field that is now a major area of AI research and commercial development. AI systems that compose music, generate images, write poetry, and create video are all, in a lineage that can be traced from her Notes, expressions of exactly what she imagined.

And her refusal to let her contribution be minimized or attributed to someone else — her insistence on her own intellectual ownership of her own ideas — is a lesson that the history of technology still needs to learn. The history of computing, like the history of most fields, has a tendency to attribute discoveries to the most prominent, the most institutional, the most conventionally credentialed figures in the story. Ada Lovelace was none of those things. She was a woman, young, often ill, working at the margins of a field that barely existed. And she was right, and original, and important. Recognizing that is not an act of political correction. It is an act of historical accuracy.


The Question She Left Us

Ada Lovelace wrote one great document and then she died, and the document was forgotten for a hundred years, and then it was found again and the world realized what had been there all along.

There is something in that story that feels almost literary — the lost manuscript rediscovered, the prophet without honor in her own time, the ideas whose time had not yet come. And there is something in it that should make us cautious: how many other extraordinary ideas are sitting in archives, in footnotes, in the letters and notebooks of people whose circumstances — their gender, their class, their health, their race — prevented them from being recognized in their own time?

But the more urgent question Ada Lovelace leaves us is the philosophical one she raised and never fully resolved. She wrote that a machine can only do what we know how to order it to perform. And then she imagined, in the same document, a machine composing music — creating something new, something that its creators could not necessarily have specified in advance.

Was she contradicting herself? Or was she seeing something subtle: that the boundary between executing an instruction and originating something new might be less clear than it seems? That a machine following complex enough rules in a rich enough symbolic space might produce outputs that genuinely surprise even the people who built it?

This is the question that Alan Turing would take up a hundred years later. It is the question that Geoffrey Hinton, Yann LeCun, and Yoshua Bengio spent their careers trying to answer by building neural networks that learned rather than being explicitly programmed. It is the question that researchers are still arguing about today, every time a large language model produces something that feels — disturbingly, thrillingly — original.

Ada Lovelace did not answer the question. She was too honest and too rigorous to claim more certainty than she had. But she asked it first. And that is enough. Asking the right question, clearly and precisely, at the right moment — seeing the shape of the problem before anyone else sees it — is itself a form of genius.

She was thirty-six years old when she died. She had been ill for most of her adult life. She had written one landmark document, at twenty-seven, for a machine that was never built. She had been forgotten almost immediately.

She is remembered now. And the question she asked has never been more urgent.


Further Reading

  • “The Thrilling Adventures of Lovelace and Babbage” by Sydney Padua — A brilliant, funny, meticulously researched graphic novel about Ada and Babbage. The footnotes alone are worth the price.
  • “Ada’s Algorithm” by James Essinger — A clear, accessible biography focused specifically on Ada’s mathematical work and the Notes.
  • “The Innovators” by Walter Isaacson — Opens with an extended portrait of Ada Lovelace and traces the thread of her ideas through the history of computing.
  • Ada Lovelace’s Notes themselves — Available online in their original form. Note G, the algorithm for Bernoulli numbers, repays careful reading even if you do not follow every step.
  • “Enchantress of Numbers” by Jennifer Chiaverini — A historical novel imagining Ada’s inner life. Fiction, but carefully researched and emotionally true.

Next in the Profiles series: P2 — Alan Turing: The Man Who Invented the Future — A child who never quite fit in. A mathematician who changed the course of a world war. A philosopher who asked whether machines could think. And a man destroyed by the country he had saved. The full story of Alan Turing — the most important person in the history of AI.


Minds & Machines: The Story of AI is published weekly. If this profile moved or surprised you, share it with someone who should know Ada Lovelace’s name.