The intersection of human biology and advanced technology offers glimpses into a future once confined to science fiction. Neuralink, the neurotechnology company founded by Elon Musk, is at the forefront of this field, developing brain-computer interfaces (BCIs) with the potential to revolutionize treatment for neurological conditions. While the technology is still in its infancy, the personal stories of early trial participants provide powerful insight into its impact. One such compelling account comes from Brad Smith, who shared his experience as the third person, and the first with ALS, to receive the Neuralink implant.

Brad Smith: A Pioneer’s Perspective
In a video detailing his experience, Brad Smith introduces himself not just as a Neuralink recipient, but as the first person with Amyotrophic Lateral Sclerosis (ALS) and the first non-verbal individual to participate in the trial. ALS, as Brad explains, is a devastating disease that progressively destroys motor neurons, leading to a loss of muscle control while leaving cognitive function intact. For Brad, this means complete reliance on a ventilator for breathing and the inability to move anything besides his eyes; communication is therefore entirely dependent on assistive technology.

Before Neuralink, Brad used an eye-gaze control system. While he calls it a “miracle of technology” in its own right, he found it frustrating, noting that it worked best in dark environments and humorously comparing himself to Batman confined to a cave. The Neuralink implant, however, has offered him newfound freedom. He highlights the ability to use his computer regardless of lighting conditions, allowing him to go outside more freely.
Brad’s narration itself is a testament to technological advancement – it’s his original voice, cloned using AI from recordings made before ALS took his ability to speak. He uses the Neuralink BCI to control the mouse cursor on his MacBook Pro, enabling him to edit the video testimony – potentially the first video ever edited using a BCI. This level of control signifies a major step forward in restoring digital autonomy.
How Neuralink Works: Brad’s Explanation
Brad offers a clear, user-friendly explanation of how the Neuralink system functions for him:
- The Implant: A device roughly the size of five stacked US quarters is implanted in the motor cortex, the brain region controlling movement. This involved replacing a small piece of his skull.
- Electrode Threads: A surgical robot meticulously inserts 64 ultra-fine threads, carrying a total of 1,024 electrodes, a few millimeters into the brain tissue, carefully avoiding blood vessels to minimize bleeding.
- Data Capture: These electrodes detect the electrical signals (neuron firings) associated with intended movements, capturing this data every 15 milliseconds. Brad describes the raw feed as looking like “the Matrix.”
- Signal Processing: The implant transmits this vast amount of raw data wirelessly via Bluetooth to a connected MacBook Pro.
- AI Decoding: Sophisticated AI algorithms on the computer process the signals, separating intended-movement activity from background noise (a toy sketch of this kind of decoding loop follows this list). Crucially, Brad emphasizes that the system decodes his intent to move the cursor, not his thoughts or internal monologue.
- Cursor Control: The decoded intention translates into real-time movement of the mouse cursor on the screen.
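Neuralink has not published the exact decoding pipeline Brad uses, but his description maps onto a familiar BCI pattern: bin the electrode activity into short windows, feed each resulting feature vector to a trained decoder, and integrate the predicted velocity into a cursor position. The Python sketch below is a hypothetical illustration of that loop, not Neuralink's code; the 15-millisecond window and the 1,024-electrode figure come from Brad's explanation, while the linear decoder, the random weights, and names such as run_cursor_loop are assumptions made purely for illustration.

```python
import numpy as np

N_ELECTRODES = 1024      # electrode count mentioned in Brad's explanation
BIN_MS = 15              # activity is captured in roughly 15 ms windows
SCREEN_W, SCREEN_H = 1920, 1080

# Hypothetical decoder weights: in practice these would be learned during
# calibration (for example, the "bubbles" game), not hard-coded like this.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(2, N_ELECTRODES))  # maps features -> (vx, vy)

def decode_velocity(spike_counts: np.ndarray) -> np.ndarray:
    """Turn one 15 ms feature vector (spike counts per electrode)
    into an intended cursor velocity in pixels per bin."""
    return W @ spike_counts

def run_cursor_loop(feature_stream):
    """Integrate decoded velocities into an on-screen cursor position."""
    x, y = SCREEN_W / 2, SCREEN_H / 2            # start at screen centre
    for spike_counts in feature_stream:          # one vector every ~15 ms
        vx, vy = decode_velocity(spike_counts)
        x = float(np.clip(x + vx, 0, SCREEN_W))  # keep the cursor on screen
        y = float(np.clip(y + vy, 0, SCREEN_H))
        yield x, y

# Toy usage: simulated spike counts stand in for the real implant stream.
fake_stream = (rng.poisson(1.0, size=N_ELECTRODES) for _ in range(5))
for px, py in run_cursor_loop(fake_stream):
    print(f"cursor at ({px:.1f}, {py:.1f})")
```

A real system layers filtering, recalibration, and click detection on top of this, but the core idea of mapping short windows of neural activity to cursor velocity is the same.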
Training and User Experience

Making the system intuitive requires training and calibration:
- Initial Training: Brad trained the system using a simple game where he moves the cursor to on-screen “bubbles.” Yellow bubbles require hovering, while blue ones require a click.
- Finding the Right Control: Initially, the team tried decoding intended hand movements, but this proved ineffective for Brad. Through careful mapping of brain signals to attempted movements, Neuralink engineers discovered that Brad’s intended tongue movements provided the best signal for cursor control, and jaw clenching was optimal for clicking. Brad notes this control becomes subconscious over time, much like using a physical mouse.
- Performance Metric (Webgrid): Neuralink uses a test called Webgrid to quantify the accuracy and speed of intention decoding, measured in bits per second (BPS). Brad achieved a peak score of 5 BPS, a significant improvement on the sub-1 BPS rates he managed with eye-gaze technology (a worked example of this kind of bitrate calculation appears after this list).
- The Mixer: A software tool that allows fine-tuning of cursor behavior (a sketch of how such settings might be implemented also appears after this list):
  - Bias Correction: Adjusts for natural drift in cursor control caused by the brain’s constantly changing signals, a feature refined through direct human feedback in a way that was not possible in earlier animal trials.
  - Speed, Friction, Smoothing: Control how quickly and smoothly the cursor moves.
  - Click Stiffness: Adjusts how deliberate the “click” intention needs to be.
- Communication Tools:
  - Neuralink Keyboard: A virtual keyboard optimized for BCI use, including predictive text.
  - Custom Keypads: Brad uses the Mac’s accessibility keyboard to create custom panels for frequently used shortcuts (copy, paste, undo, etc.).
  - Parking Spot: A feature Brad requested that lets him “park” the cursor by moving it to a corner of the screen so it doesn’t interfere while he is watching videos or resting. This was vital because, unlike previous participants, he cannot use voice commands to pause it.
  - AI Chat Assistant: To bridge the gap between thought speed and typing speed, Brad uses a chat application that listens to conversations and uses AI (Brad mentions ChatGPT, combined with the AI clone of his voice) to generate relevant response options quickly. He shares a humorous example in which it suggested telling a friend seeking gift ideas for a horse-loving girlfriend to “get her a bouquet of carrots.”
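Neuralink has not detailed the scoring formula behind Webgrid, but grid-selection tasks like it are conventionally scored with a Fitts-style bitrate: each correct selection among G on-screen targets conveys roughly log2(G - 1) bits, and the net bits earned are divided by elapsed time. The snippet below sketches that conventional calculation as a worked example; the grid size and timings are invented for illustration and are not Brad's actual session data.

```python
import math

def webgrid_style_bps(grid_targets: int,
                      correct: int,
                      incorrect: int,
                      seconds: float) -> float:
    """Conventional bitrate metric for grid-selection BCI tasks:
    each correct pick among `grid_targets` squares is worth
    log2(grid_targets - 1) bits; wrong picks are penalised."""
    bits_per_selection = math.log2(grid_targets - 1)
    net_selections = max(correct - incorrect, 0)
    return bits_per_selection * net_selections / seconds

# Illustrative numbers only: a 35-square grid with 60 correct and 2 wrong
# selections over two minutes works out to roughly 2.5 BPS.
print(round(webgrid_style_bps(35, 60, 2, 120.0), 2))
```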
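The Mixer settings Brad lists (bias correction, speed, friction, smoothing, click stiffness) map naturally onto a small post-processing stage applied to each decoded velocity before it moves the cursor. The sketch below shows one plausible way such knobs could work; the class, the parameter names, and the exponential-smoothing approach are assumptions for illustration, not Neuralink's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class MixerSettings:
    gain: float = 1.0             # "speed": scales decoded velocity
    friction: float = 0.1         # damps small, jittery movements
    smoothing: float = 0.7        # 0..1, exponential smoothing of velocity
    click_stiffness: float = 0.8  # how strong the click signal must be
    bias_x: float = 0.0           # manual bias correction for drift
    bias_y: float = 0.0

class CursorMixer:
    """Hypothetical post-processing stage between decoder output and the cursor."""

    def __init__(self, settings: MixerSettings):
        self.s = settings
        self.vx = 0.0
        self.vy = 0.0

    def step(self, raw_vx: float, raw_vy: float, click_signal: float):
        s = self.s
        # Bias correction: subtract the slow drift the user has dialled in.
        vx = (raw_vx - s.bias_x) * s.gain
        vy = (raw_vy - s.bias_y) * s.gain
        # Smoothing: blend with the previous velocity to reduce jitter.
        self.vx = s.smoothing * self.vx + (1 - s.smoothing) * vx
        self.vy = s.smoothing * self.vy + (1 - s.smoothing) * vy
        # Friction: ignore movements below a small threshold.
        if abs(self.vx) < s.friction:
            self.vx = 0.0
        if abs(self.vy) < s.friction:
            self.vy = 0.0
        # Click stiffness: only register a click on a strong enough signal.
        clicked = click_signal >= s.click_stiffness
        return self.vx, self.vy, clicked

# Example: stiffer clicks and a slight rightward drift correction.
mixer = CursorMixer(MixerSettings(click_stiffness=0.9, bias_x=0.05))
print(mixer.step(raw_vx=0.4, raw_vy=-0.2, click_signal=0.95))
```

Keeping these controls in a thin layer on top of the decoder, as in this sketch, is what would allow them to be adjusted per user without retraining the underlying model.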
Bradford G Smith’s Video About His Neuralink Experience
Neuralink: The Broader Vision and Current Status
Neuralink’s overarching goal is to create a high-bandwidth, safe, and reliable brain-computer interface. The company’s initial aim is to restore capabilities such as communication and environmental control for people with severe paralysis. Longer-term ambitions extend to potentially addressing blindness, deafness, and other neurological disorders, and perhaps even augmenting human capabilities.
The system comprises the N1 implant (the device placed in the brain) and the R1 surgical robot designed for precise, minimally invasive implantation. The company faced scrutiny over its animal-testing protocols before receiving FDA approval for human trials; its first-in-human study, PRIME (Precise Robotically Implanted Brain-Computer Interface), began in 2023. Noland Arbaugh was the first publicly known participant, demonstrating gameplay and computer control in early 2024. Brad Smith’s testimony adds another crucial layer of human experience to the technology’s development.
Impact and Future
Brad Smith’s story is a powerful illustration of Neuralink’s potential. Beyond the technical specifications, it’s about restoring connection, independence, and hope. He speaks movingly about how the technology has given him “freedom, hope, and faster communication.” He views his participation, facilitated by a move to Arizona where Neuralink established a site, as part of a larger plan, allowing him to contribute to something that could help many others.
While acknowledging that “ALS still really sucks,” Brad emphasizes the positive impact on his life – enabling him to work with the Neuralink team, improving his ability to interact, and strengthening his bond with his wife, Tiffany, whom he credits as an essential part of his journey. His experience underscores the iterative nature of BCI development, where user feedback directly shapes features like the “parking spot.”
Brad Smith’s journey with Neuralink highlights the profound personal significance of this emerging technology. While challenges remain and the road to widespread application is long, his experience provides a tangible example of how BCIs could dramatically improve the quality of life for individuals facing severe physical limitations, truly bridging the gap between mind and machine.