
Authorship on the Eve of the Neural Age 

“Neuralink and I, we’re on the eve of something great, so that works out perfect, too. Also – Adam and Eve. God created Adam, and then gave Adam a helper, who is Eve. I’m Adam, in this scenario, and Eve is my helper. Together they cursed humanity. Maybe I will do the same, with Eve.” 



After a freak accident left Noland Arbaugh entirely dependent on his family, Neuralink arrived with the promise of a better life – the unification of humans and machines. It worked; Arbaugh, the first person to receive the company's brain implant, can now move a computer cursor with his mind, allowing him to type, browse the internet, return to school, and even play video games. The chip has given him back a measure of control once thought impossible. He calls it Eve.


Neuralink is part of a broader category of brain-computer interface neurotechnologies that create a direct communication pathway between neural activity and an external device. At its simplest, the chip translates the brain’s electrical signals into digital commands, allowing a person to control a cursor on a computer or other device. 
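
To make that translation concrete, here is a minimal, purely illustrative Python sketch of the kind of computation a cursor decoder performs: simulated firing rates from a set of electrodes are mapped to a two-dimensional cursor velocity through a linear decoder. The channel count, weights and signals are invented for illustration and bear no relation to Neuralink’s actual system.

```python
# Illustrative only: a toy linear decoder turning simulated neural firing
# rates into a 2-D cursor velocity. Real BCI pipelines are far more complex;
# every number here is made up.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 64  # hypothetical number of recorded electrode channels
firing_rates = rng.poisson(lam=5.0, size=n_channels).astype(float)

# A decoder is normally fit from calibration sessions; this weight matrix is
# random purely to show the shape of the computation (2 outputs: vx, vy).
decoder_weights = rng.normal(scale=0.01, size=(2, n_channels))

cursor_velocity = decoder_weights @ firing_rates
print(f"decoded cursor velocity (vx, vy): {cursor_velocity}")
```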


These implants are often marketed as a way to restore independence for people with paralysis, ALS or other neurodegenerative diseases. Yet beneath this humanitarian vision lies something far more ambitious, godlike even: the augmentation of human cognition itself. Soon, artificial intelligence (AI) may no longer be a tool we consult on a screen, but something we can summon within the mind. If just a thought can call on a generative model as easily as a memory, and if the same tools designed to restore the body could one day expand memory, accelerate creativity or let us interface directly with AI, then what happens next? What if the next upgrade is us?  


This raises a myriad of issues, including privacy, the security of neural data, informed consent, and even autonomy itself. But perhaps the most unsettling challenge is a legal one. We have long built our creative and economic systems on the idea that human work is inherently more valuable than machine output. These values are reflected in Canadian copyright law, which is already struggling to deal with the emergence of generative AI models, clinging to the idea that only “original works” born of human skill and judgment deserve protection. For example, in CCH Canadian Ltd v Law Society of Upper Canada, the Supreme Court of Canada defined skill as the use of knowledge and practiced ability, and judgment as the capacity to evaluate and choose between options in producing a work. With the emergence of technologies that further integrate humans with machines, how will we know whether an artwork is the product of human skill and judgment or of an AI’s calculations? Copyright law, as it stands, cannot answer these questions. 


When our thoughts are increasingly shared with or shaped by intelligent systems, individuality begins to dissolve into a kind of second consciousness. The mind becomes a networked space, a node in a dialogue between biology and code. As Carys Craig and Ian Kerr write, authorship is not a solitary act but a relational one that depends on our social and, now, technological surroundings. In the neural age, that relation extends inward, to the circuitry of the self. 


If that’s true, then the future of copyright may depend not on proving what we have created, but on how we created it. One could imagine a human provenance model as a digital watermark of cognition. Such a system would invert the current logic of authorship: instead of granting protection by default, creators might have to prove their work was not machine-made. Alternatively, perhaps the law itself will need a complete overhaul, one that recognizes hybrid or collective forms of creation where the line between self and system no longer holds. In that sense, Arbaugh’s name for his chip feels prophetic. Eve is both origin and departure, the beginning of a new kind of authorship born not from isolation, but from entanglement.  
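
As a thought experiment only, the sketch below shows one shape such a provenance record might take: a finished work is hashed and the hash is signed, so a third party can later verify who vouched for its origin. Every name here (attest, verify, AUTHOR_SECRET) is hypothetical; this is not an existing legal or technical standard, and a real scheme would need asymmetric keys, trusted registries and, crucially, some way of attesting to the human share of the creative process itself.

```python
# Hypothetical "human provenance" attestation: hash a finished work and sign
# the hash so others can verify the claim of origin. Purely illustrative;
# a symmetric HMAC stands in for what would really be a public-key signature.
import hashlib
import hmac
import json
import time

AUTHOR_SECRET = b"author-held-signing-key"  # placeholder key, not a real credential

def attest(work: bytes, author: str) -> dict:
    """Produce a signed provenance record for a finished work."""
    record = {
        "author": author,
        "sha256": hashlib.sha256(work).hexdigest(),
        "timestamp": int(time.time()),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(AUTHOR_SECRET, payload, hashlib.sha256).hexdigest()
    return record

def verify(work: bytes, record: dict) -> bool:
    """Check that the work matches the record and the signature is intact."""
    if hashlib.sha256(work).hexdigest() != record["sha256"]:
        return False
    payload = json.dumps(
        {k: record[k] for k in ("author", "sha256", "timestamp")}, sort_keys=True
    ).encode()
    expected = hmac.new(AUTHOR_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

claim = attest(b"a poem typed one keystroke at a time", "N. Arbaugh")
print(verify(b"a poem typed one keystroke at a time", claim))  # True
```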


The opinion is the author’s, and does not necessarily reflect CIPPIC’s policy position.

 
 