5 Technologies that could become dangerous in the coming years


Global investment in technology is expected to reach $3.8 trillion in 2019, up 3.2% from 2018, according to Gartner. The consultancy points out that this growth will be driven by a shift in the very concept of information technology. Technological tools that once seemed unlikely will, within a few years, be part of everyday life, bringing new resources and conveniences. But beyond that, these advances can lead to consequences that no one has yet foreseen.

 

Below you will find 5 technologies that deserve attention in the coming years.

 

1- Facial recognition

 

Facial recognition has some incredibly useful applications, but it can just as easily be put to illegitimate use. China stands accused of using the technology for racial-profiling surveillance: its cameras not only track pedestrians but are also used to monitor and control the Muslim population living in the country. In Russia, cameras scour the streets for “persons of interest”, and in Brazil there have been cases in which facial recognition failed, leading to the arrest of people who were not actually wanted.

 

In addition, there is concern about the unlimited storage of biographical and registration data that point directly to a person. Biological and behavioral characteristics, analyzed automatically, can identify someone through gestures and behavior just as fingerprints and iris scans do. The problem lies in what happens when this information falls into the wrong hands.
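To understand why such a database is sensitive, it helps to see how automated biometric matching typically works: a face (or gait, or iris) is reduced to a numeric “embedding”, and identification then amounts to a distance comparison against stored records. The Python sketch below is only a minimal illustration of that idea; the `embed_face` function and the 0.6 threshold are placeholder assumptions standing in for whatever real model and tuning an actual deployment would use.

```python
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a real face-embedding model
    (in practice, a neural network mapping a face crop to a vector)."""
    rng = np.random.default_rng(int(image.sum()) % 2**32)
    vec = rng.normal(size=128)
    return vec / np.linalg.norm(vec)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe_image: np.ndarray, database: dict, threshold: float = 0.6):
    """Return the best-matching identity in `database`, or None.
    `database` maps a person's name to their stored embedding."""
    probe = embed_face(probe_image)
    best_name, best_score = None, threshold
    for name, stored in database.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Whoever holds the stored embeddings can run this same comparison against any camera feed, which is exactly the “wrong hands” scenario described above.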

 

2- Smart Drones

 

A drone intelligent enough to go into combat or to coordinate search-and-rescue missions is close to becoming reality. Drones like these can reduce effort and carry out such missions efficiently. However, the same technology could also be used in wartime operations, from guerrilla actions to open combat, and in the wrong hands it could cause serious problems.

 

The US Department of Defense held military tests in October last year using a swarm of 103 micro-drones.

 

Improvements in the drones’ artificial intelligence allowed small groups of robots to act together, either under human control or fully autonomously. According to the test report, the micro-drones demonstrated advanced swarm behaviors such as collective decision-making, adaptive formation flying, and self-healing.
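The control software used in that test is not public, but the kind of “formation flying” described is often illustrated with simple flocking rules (cohesion, alignment, separation), in which each drone reacts only to its nearby neighbors and the group-level behavior emerges on its own. The Python sketch below is a generic illustration of that idea under assumed parameters, not a reconstruction of the Pentagon’s system.

```python
import numpy as np

N, RADIUS, DT = 30, 5.0, 0.1                     # drones, neighbor radius, time step
rng = np.random.default_rng(0)
positions = rng.uniform(0, 20, size=(N, 2))      # 2-D positions
velocities = rng.normal(0, 1, size=(N, 2))       # 2-D velocities

def step(pos, vel):
    new_vel = vel.copy()
    for i in range(N):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        mask = (dist < RADIUS) & (dist > 0)      # neighbors, excluding self
        if not mask.any():
            continue
        neighbors, neighbor_vel = pos[mask], vel[mask]
        cohesion = neighbors.mean(axis=0) - pos[i]        # move toward the group
        alignment = neighbor_vel.mean(axis=0) - vel[i]    # match the group's heading
        too_close = neighbors[dist[mask] < 1.0]
        separation = (pos[i] - too_close).sum(axis=0) if len(too_close) else 0.0
        new_vel[i] += 0.01 * cohesion + 0.05 * alignment + 0.1 * separation
    return pos + new_vel * DT, new_vel

for _ in range(100):                              # run the swarm for 100 steps
    positions, velocities = step(positions, velocities)
```

Each drone follows the same three local rules, yet the swarm as a whole converges on a shared heading, which is the sense in which such systems “decide” collectively.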

 


 

3- Artificial Intelligence Cloning

 

With the support of artificial intelligence (AI), all it takes to clone someone’s voice is a short snippet of that person speaking. Similarly, AI can take multiple pictures or videos of a person and generate an entirely new, cloned video that looks like the real thing.

 

Deepfake technology, for example, uses facial mapping, machine learning, and artificial intelligence to create representations of real people doing and saying things they never did. It is now being turned on “ordinary” people as well: the technology has advanced to the point where very little data is needed to create a convincing fake video, and social networks offer plenty of pictures and videos of everyday Internet users to draw from.
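At the core of many face-swap deepfakes is a simple architectural trick: a single encoder learns a shared representation of faces, a separate decoder is trained for each identity, and the swap consists of encoding a face of person A and decoding it with person B’s decoder. The PyTorch sketch below shows only that structure on dummy tensors; the layer sizes, image resolution, and data are illustrative assumptions, not a working deepfake pipeline.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 face crop into a latent vector."""
    def __init__(self, latent: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * 64 * 64, 1024), nn.ReLU(),
            nn.Linear(1024, latent), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs a face from the latent vector."""
    def __init__(self, latent: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent, 1024), nn.ReLU(),
            nn.Linear(1024, 3 * 64 * 64), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()       # one decoder per identity

# Training (omitted) would reconstruct person A's photos through
# encoder + decoder_a and person B's photos through encoder + decoder_b.
# The "swap": encode a face of A, then decode it with B's decoder.
face_of_a = torch.rand(1, 3, 64, 64)              # dummy stand-in for a real photo
swapped = decoder_b(encoder(face_of_a))           # B's appearance, A's pose
print(swapped.shape)                              # torch.Size([1, 3, 64, 64])
```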

 

4- Fake News Bots
 

GROVER is an artificial intelligence system capable of writing a fake news story from nothing more than a headline. OpenAI, a nonprofit research company, created bots that produce news articles and works of fiction so convincing that the organization decided not to publicly release the research, to prevent the technology from being misused. The possibilities brought to light by these bots are enough to make any government lose sleep.
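Neither GROVER nor OpenAI’s withheld models are needed to see why this worries people: the underlying idea, conditioning a language model on a headline and letting it continue the “article”, can be approximated with openly available tools. The sketch below uses the open GPT-2 model through the Hugging Face transformers library as a stand-in; the prompt and generation settings are purely illustrative.

```python
from transformers import pipeline

# Any autoregressive language model can continue a headline; GPT-2 is an
# openly available stand-in for systems like GROVER.
generator = pipeline("text-generation", model="gpt2")

headline = "Scientists confirm chocolate cures the common cold"
prompt = f"{headline}\n\nBy Staff Writer\n\n"

article = generator(
    prompt,
    max_length=150,            # total length in tokens, prompt included
    do_sample=True,            # sample rather than pick the most likely word
    temperature=0.9,           # higher values give more varied (and less factual) text
    num_return_sequences=1,
)[0]["generated_text"]

print(article)
```

The output is far weaker than GROVER’s, but the workflow (headline in, plausible-sounding copy out) is the same, which is why the barrier to producing fake news at scale keeps dropping.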

 

5- “Smart Dust”

 

Microelectromechanical systems (MEMS) the size of a grain of salt carry sensors, communication mechanisms, autonomous power sources, and cameras. Also called motes, this “smart dust” has a multitude of positive uses in health, safety, and more, but it would be frightening if put to illicit use.