
Scientists Warn of “Bleak Cyborg Future” From Brain-Computer Interfaces



Researchers warn of the potential social, ethical, and legal consequences of technologies interacting heavily with human brains.

Surpassing the biological limitations of the brain and using one's mind to interact with and control external electronic devices may sound like something out of a distant cyborg future, but it could come sooner than we think.

Researchers from Imperial College London conducted a review of modern commercial brain-computer interface (BCI) devices, and they discuss the primary technological limitations and humanitarian concerns of these devices in APL Bioengineering, from AIP Publishing.

The most promising method to achieve real-world BCI applications is through electroencephalography (EEG), a method of monitoring the brain noninvasively through its electrical activity. EEG-based BCIs, or eBCIs, will require a number of technological advances prior to widespread use, but more importantly, they will raise a variety of social, ethical, and legal concerns.



A schematic demonstrates the steps required for eBCI operation. EEG sensors acquire electrical signals from the brain, which are processed and outputted to control external devices. Credit: Portillo-Lara et al.​
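The pipeline the schematic describes (acquire electrical signals, process them, output a device command) can be sketched in a few lines. Everything here is a hypothetical illustration, not the authors' implementation: the 10 Hz alpha-band target, the power threshold, and the command names are all assumptions made for the example.

```python
# Minimal sketch of an eBCI pipeline: EEG samples in, device command out.
# All thresholds and command names are hypothetical, for illustration only;
# a real system uses calibrated hardware and trained classifiers.
import math

def band_power(samples, fs, freq):
    """Estimate signal power at one frequency via a single DFT bin."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(samples))
    return (re * re + im * im) / n

def decode_command(samples, fs, alpha_hz=10.0, threshold=1.0):
    """Map alpha-band (~10 Hz) power to a device command (illustrative rule)."""
    return "MOVE" if band_power(samples, fs, alpha_hz) > threshold else "IDLE"

# Two simulated 1-second EEG epochs at 250 Hz:
# one with a strong 10 Hz rhythm, one with no activity.
fs = 250
alpha_epoch = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]
flat_epoch = [0.0] * fs
print(decode_command(alpha_epoch, fs))  # MOVE
print(decode_command(flat_epoch, fs))   # IDLE
```

In practice the "processing" stage is where most of the engineering lives: filtering out muscle and eye-movement artifacts, extracting features, and classifying them per user, since EEG patterns vary from person to person.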


Though it is difficult to understand exactly what a user experiences when operating an external device with an eBCI, a few things are certain. For one, eBCIs can communicate both ways. This allows a person to control electronics, which is particularly useful for medical patients who need help controlling wheelchairs, for example, but it also potentially changes the way the brain functions.

“For some of these patients, these devices become such an integrated part of themselves that they refuse to have them removed at the end of the clinical trial,” said Rylie Green, one of the authors. “It has become increasingly evident that neurotechnologies have the potential to profoundly shape our own human experience and sense of self.”

Aside from these potentially bleak mental and physiological side effects, intellectual property concerns are also an issue and may allow private companies that develop eBCI technologies to own users’ neural data.

“This is particularly worrisome, since neural data is often considered to be the most intimate and private information that could be associated with any given user,” said Roberto Portillo-Lara, another author. “This is mainly because, apart from its diagnostic value, EEG data could be used to infer emotional and cognitive states, which would provide unparalleled insight into user intentions, preferences, and emotions.”

As these platforms become available beyond medical treatment, disparities in access to them may exacerbate existing social inequalities. For example, eBCIs could be used for cognitive enhancement, creating extreme imbalances in academic or professional success and educational advancement.

“This bleak panorama brings forth an interesting dilemma about the role of policymakers in BCI commercialization,” Green said. “Should regulatory bodies intervene to prevent misuse and unequal access to neurotech? Should society follow instead the path taken by previous innovations, such as the internet or the smartphone, which originally targeted niche markets but are now commercialized on a global scale?”

She calls on global policymakers, neuroscientists, manufacturers, and potential users of these technologies to begin having these conversations early and collaborate to produce answers to these difficult moral questions.

“Despite the potential risks, the ability to integrate the sophistication of the human mind with the capabilities of modern technology constitutes an unprecedented scientific achievement, which is beginning to challenge our own preconceptions of what it is to be human,” Green said.

Reference: “Mind the gap: State-of-the-art technologies and applications for EEG-based brain-computer interfaces” by Roberto Portillo-Lara, Bogachan Tahirbegi, Christopher A.R. Chapman, Josef A. Goding and Rylie A. Green, 20 July 2021, APL Bioengineering.




dr_shadow

Trust me, I'm a doctor
Moderator
It's the next logical step, IMO. People are already online 24/7 via their phones, so it would just be convenient if the "phone" becomes an implant that you can't misplace and that leaves your hands free to do other things.

But you'd need to develop new multitasking skills so you can watch cat videos in your mind while you're driving down an expressway.

Power supply would also seem to be an issue. Do you need to put a cord into your neck before you go to sleep for the implant to work throughout the day?
 
It's the next logical step, IMO. People are already online 24/7 via their phones, so it would just be convenient if the "phone" becomes an implant that you can't misplace and that leaves your hands free to do other things.

But you'd need to develop new multitasking skills so you can watch cat videos in your mind while you're driving down an expressway.

Power supply would also seem to be an issue. Do you need to put a cord into your neck before you go to sleep for the implant to work throughout the day?

This is something that could happen. We are likely to figure out power sources that are currently inconceivable as AI becomes smarter than humans around 2045.
 
This is something that could happen. We are likely to figure out power sources that are currently inconceivable as AI becomes smarter than humans around 2045.
I read some time ago that AI will never truly become smarter than us, since no matter how fast it develops, biology is, and will always be, ahead. No idea of the credibility of such a bold statement, but considering we're still struggling to make any machinery as complex as a single cell, I'm not ruling it out.
 
I read some time ago that AI will never truly become smarter than us, since no matter how fast it develops, biology is, and will always be, ahead. No idea of the credibility of such a bold statement, but considering we're still struggling to make any machinery as complex as a single cell, I'm not ruling it out.
There are so many things that could go wrong. :krilldead
 

Mickey Mouse

Disney Overlord
It's the next logical step, IMO. People are already online 24/7 via their phones, so it would just be convenient if the "phone" becomes an implant that you can't misplace and that leaves your hands free to do other things.

But you'd need to develop new multitasking skills so you can watch cat videos in your mind while you're driving down an expressway.

Power supply would also seem to be an issue. Do you need to put a cord into your neck before you go to sleep for the implant to work throughout the day?
You're just saying this because by the time this apocalypse comes, you are going to be too old to care, or dead.:catippy
 
Am I the only user here who is deeply disturbed by this and actively against further developing this type of technology? Has anyone here seen science fiction movies where machines enslave and/or annihilate humans? Do they wish for those movies to become a reality?
 

Hand Banana

Congratulations! You reached the final villain.
Am I the only user here who is deeply disturbed by this and actively against further developing this type of technology? Has anyone here seen science fiction movies where machines enslave and/or annihilate humans? Do they wish for those movies to become a reality?
Bro shut the f*ck up. God you make me so angry every time I read one of your posts.
 

dr_shadow

Trust me, I'm a doctor
Moderator
Am I the only user here who is deeply disturbed by this and actively against further developing this type of technology? Has anyone here seen science fiction movies where machines enslave and/or annihilate humans? Do they wish for those movies to become a reality?

You are, indeed, the only user here who is deeply disturbed by this.
 

dr_shadow

Trust me, I'm a doctor
Moderator
You're just saying this because by the time this apocalypse comes, you are going to be too old to care, or dead.:catippy

I'm expecting that medical advances will give me a pretty good shot at reaching 100, so I plan my life on the assumption that I will live to 2089.

I.e. if a future event is scheduled to happen before 2089 (for example, the end of One Country – Two Systems in 2047), I assume I'll be around to experience it.
 
You are, indeed, the only user here who is deeply disturbed by this.
I don't know, I'm more disturbed that we don't focus on hacking our biology at the genetic level first. Implants and brain chips are all good, but I'd rather have a "naturally" extended lifespan and increased healing. It's definitely the way harder approach and can backfire HORRIBLY, but it's the better option in the long run.
 

Amol

Chief of Wisdom
As a software engineer, I can say that AI can never be smarter than humans. In fact, they are not even smart. They can never think.

AI, however, is faster, which gives the illusion of being smarter. I guess when you can do something in minutes that would take a human years, you can technically be considered 'smarter'.

AI can never actually gain sentience or whatever. It will remain dumb in actual terms of thinking. I have coded some parts of AI. You literally gotta program the AI for every small thing. Every time it comes across a scenario that wasn't explicitly taught, it crashes.

The most powerful AIs are just the ones that have been taught the most and are unlikely to come across a scenario that wasn't part of their programming.
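A toy sketch of the failure mode this post describes: a purely rule-based system handles only the inputs it was explicitly programmed for. The rules and command names below are made up for illustration; modern learning-based systems generalize statistically rather than via hard-coded rules, though they too degrade on inputs far outside their training data.

```python
# Hypothetical rule table for a toy "smart assistant" -- every behavior must
# be explicitly programmed in advance.
RULES = {
    "turn on light": "light_on",
    "turn off light": "light_off",
}

def handle(command):
    try:
        # Only explicitly taught scenarios are handled.
        return RULES[command]
    except KeyError:
        # Any untaught scenario falls through -- the "crash" described above.
        return "ERROR: unhandled scenario"

print(handle("turn on light"))  # light_on
print(handle("dim the light"))  # ERROR: unhandled scenario
```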
 

Yami Munesanzun

I am now having ALL teh segzy.
Technology is built on logic and precision, but in reality that's just the way we look at those two concepts. Biology, or rather life, is messy, awkward, imprecise, and often appears illogical, yet it works better than any machine ever will. It shows us we know nothing of how the universe works.
Real talk, tho.

It'd probably be more of a mind "copy-and-paste" than a mind "drag-and-drop"
 

Amol

Chief of Wisdom
I bet you drink way more coffee than me.
Is that a software engineer stereotype?
It is very accurate though.
We don't exactly have working hours. So drinking lots of coffee is the only way to function tbh.
But yeah, as I said earlier, scientists have already proposed that AI won't ever reach the intelligence of a biological organism, unless of course we get into mind uploads, but that's cheating.
AI going Skynet is an impossibility.

I am not saying that AI won't ever end up committing mass murder, but it would really be a programmer leaving bugs in the code rather than the AI actually deciding to kill humans.
 
Real talk, tho.

It'd probably be more of a mind "copy-and-paste" than a mind "drag-and-drop"
Right now the only way we know of scanning a brain to upload someone's mind into a machine requires literally slicing the brain into single-cell-thick slices and scanning those :maybe

But yeah, it won't be you; it'll be a copy of you up to the moment of upload. After that it diverges, since from the second it wakes it already experiences the world in a different way.

Is that a software engineer stereotype?
It is very accurate though.
We don't exactly have working hours. So drinking lots of coffee is the only way to function tbh.

AI going Skynet is an impossibility.

I am not saying that AI won't ever end up committing mass murder, but it would really be a programmer leaving bugs in the code rather than the AI actually deciding to kill humans.
It's not an offensive one though; I've been called cola boy/man ever since high school because of my hyperactive attitude and need for energy. I'm just saying :hoho

Glitch Apocalypse! Quick, write that down, man!
 