Friday 22 July 2016

Future Data Mining and the Future Human and Future Robot

Who needs more than a basic education when peer-to-peer data transfer is normal?

It's a strange thing to think, but perhaps our current education system will be irrelevant in a few years' time. More so than it already is!

Neural networks over fibre optic cable might transmit the thoughts of tomorrow

We are looking at Human 2.0 in the near future. Whether this will take the form of an autonomous humanoid, incorporate a human psyche and brain material, or exist in some other form, perhaps as ethereal energy, is open to speculation.

Any human has an inbuilt problem: it has to spend years learning. Unlike a computer, which can now seek data in milliseconds, the human has to acquire a library of information to function above the burned-in motor skills and basic life-support OS it needs to run the body.

We know of learning robots, like the ones that monitor your internet browsing choices and supply like-minded suggestions to consider. These algorithmic acquisition robots, through their machinations, create data profiles which are useful to those who analyse patterns of behaviour.

What if we were able to just upload the relevant skills and knowledge instantly to our minds? The danger then is one of relevancy: what to keep and what to discard? This would make us all capable of being super-intelligent, as required, instantly.

But, why would we need to have this capability? Why should we devote our time to things that a robot itself could identify as necessary and divine from analysis?

The robot mechanic might determine that a job on a car is required and instantly create a parts and tools list; it then downloads the job instructions, completes the job and does the 'paperless' paperwork, providing a printed work receipt and bill for the customer.
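That diagnose-plan-execute-bill loop can be sketched as a simple pipeline. This is purely an illustrative sketch: every function name, fault code and price below is invented for the example.

```python
# Hypothetical sketch of the robot mechanic's workflow; all names and data are invented.

def diagnose(car):
    """Identify the required job from the car's reported fault codes."""
    return "replace_brake_pads" if "brake_wear" in car["faults"] else None

def plan(job):
    """Produce the parts and tools lists for a job (hard-coded catalogue for illustration)."""
    catalogue = {
        "replace_brake_pads": {"parts": ["pads", "clips"], "tools": ["jack", "wrench"]},
    }
    return catalogue[job]

def bill(job, parts, labour_hours, rate=60):
    """Generate the 'paperless' paperwork: a simple itemised receipt."""
    part_cost = 25 * len(parts)          # invented flat price per part
    total = part_cost + labour_hours * rate
    return {"job": job, "parts": parts, "total": total}

car = {"faults": ["brake_wear"]}
job = diagnose(car)
work = plan(job)
receipt = bill(job, work["parts"], labour_hours=1.5)
print(receipt["total"])  # 50 for parts + 90 labour = 140.0
```

The point is not the toy prices but the shape: each stage feeds the next with no human in the loop until the receipt is printed.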

That is taking the situation some way into the future, but having an etheric link to information does enhance the human.

Using fibre optic cabling for neural networking gives speed-of-light data transfer, with the advantage that it does not generate heat the way the human system does, running on electrical energy at around 10 watts. Scientists have already implanted chips directly into the cortex of volunteers, but the ultimate goal is to have a human consciousness in a humanoid shell.

Imagine that you can leave your body, as some people have experienced in an out-of-body experience, myself included. That you function perfectly outside of the human chassis, as an ethereal spirit force of energy, that is your consciousness state, suggests that your perception of your environment is external to your brain; the brain is just the data interface module that uses the information for you to operate with.

You are still alive, but as energy. Now, if you could take that 'cloud' of spiritual and ethereal energy and consciousness and connect it to a broad-spectrum parallel processing computer, you would then have true genius level. On two legs.

The problem with most computers, as Professor Michio Kaku points out, is that they have a data 'choke point' in the processing architecture. Essentially, the data is forced through a small pipe; if you pushed the data through pipes laid side by side, your capacity would be multiplied fantastically, as would your productivity. So: a single linear processor compared with multiple linear units.
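The pipe analogy reduces to simple arithmetic: total throughput scales with the number of parallel pipes, while a single pipe is capped at its own rate. A back-of-envelope sketch, with invented figures:

```python
# Illustrating the 'choke point' argument with made-up transfer rates.

def transfer_time(data_gb, pipes, gb_per_sec_per_pipe=1.0):
    """Seconds to move data_gb through `pipes` identical parallel channels."""
    return data_gb / (pipes * gb_per_sec_per_pipe)

print(transfer_time(100, pipes=1))   # 100.0 seconds through one pipe
print(transfer_time(100, pipes=10))  # 10.0 seconds through ten pipes laid side by side
```

Ten pipes, ten times the capacity; the choke point only disappears when the workload can actually be spread across the pipes, which is the architectural problem the next paragraph describes.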

And that is the current problem with the artificial brain: it is trying to use an unsuitable form of architecture to break out of the constrictions. Think of the data flow as water through pipes; by using incorrect fluid transfer dynamics, you are ensuring that your system is inefficient.

But how does this work with big data? The internet of things, or more correctly of connected things, is progressing. The collection of user data is big business. When this is really on stream, it will be a major driver in commerce.

Big data will allow producers of perishable goods to pick and pack at premium windows of freshness, and automated harvesting will ensure a 24/7 continuous and structured supply, minimising wastage and maximising profitability for the supplier and supermarket.

Big data will be present in most of our consumable products, from a smart label to a cheap microcircuit data tag. Indeed, internet capacity will have to be increased massively to cope with the major amounts of data in transmission and in data storage.

Whilst we make advances on the one hand, we create architectural problems on the other. The cost of big data storage will be offset by the demand for big data analytics, which may include telemetry data on products as diverse as cars and washing machines, recording how the machine is used, serviced and maintained. Because this data is ultimately valuable to someone, much like the SEO data of today, big data is the thing for tomorrow: real data with actual machine identification by geographical location and IP address.

From this data, the manufacturer will be able to engineer solutions, and for the warranty industry the telemetry data will provide a unique record of how the machine was treated.
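As a sketch of what such a warranty record might look like: a stream of telemetry events summarised into "how was the machine treated?". Every field name and event here is hypothetical, not any real appliance's telemetry format.

```python
# Hypothetical telemetry log for a washing machine; all fields and events invented.
events = [
    {"type": "cycle", "temp_c": 40},
    {"type": "cycle", "temp_c": 90},
    {"type": "service", "note": "filter cleaned"},
    {"type": "cycle", "temp_c": 90},
]

# A warranty assessor's summary of the machine's treatment.
cycles = [e for e in events if e["type"] == "cycle"]
hot_cycles = sum(1 for e in cycles if e["temp_c"] >= 90)
serviced = any(e["type"] == "service" for e in events)

print(len(cycles), hot_cycles, serviced)  # 3 2 True
```

From a record like this, a claim could be assessed against objective usage history rather than the owner's word.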

We are on the verge of a two-speed world: those inside the data tent and those who are not.

Our current education system is based on an out-of-date 1950s model; it is no longer relevant. The future of work and human life is something we should be analysing and making plans for.

In our lifetimes, peer-to-peer data acquisition may be possible, certainly as a tool to deal with Alzheimer's and dementia. By making backup copies of the neural networks and data in the brain, we could almost eliminate the impact of those diseases on humans, providing undisrupted synthetic neural networks which are easily replaced if damage occurs.

Like a blood transfusion or a dialysis machine, 'backing up your brain' will be just as normal a process. Indeed, using an ethereal energy 'spirit' based model, the brain would never have to age or deteriorate at all. Much like a computer's RAM 'clipboard' holding area, the backup of your brain could be uploaded to an undamaged sector.

Indeed, it would avoid the 'old school' disk defragmentation problems of yore, through automatic optimisation of space allocation and indexing.

Like a computer, the brain would just keep giving answers until a fault was recognised. And any software 'instructions' damaged by a computer version of dementia would not be so devastating: like a computer, you would just reload the main operating system and perhaps the backup copy of the human data.
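The backup-and-reload analogy can be sketched literally: keep a checksummed copy of each 'sector' of data, and when a sector's contents no longer match their fingerprint, reload it from the backup. This is the computing metaphor made concrete, not neuroscience; all the data is invented.

```python
# Illustrative sketch of the 'back up your brain' metaphor using checksummed sectors.
import copy
import hashlib

def checksum(sector):
    """Fingerprint a sector's contents so corruption can be detected."""
    return hashlib.sha256(repr(sorted(sector.items())).encode()).hexdigest()

# A 'brain' as named sectors of data, plus a pristine backup of each sector.
brain = {"motor": {"walk": True}, "memory": {"name": "Alice"}}
backup = {name: (copy.deepcopy(data), checksum(data)) for name, data in brain.items()}

# Simulate damage to one sector.
brain["memory"]["name"] = None

# Restore any sector whose checksum no longer matches its backup.
for name, (saved, digest) in backup.items():
    if checksum(brain[name]) != digest:
        brain[name] = copy.deepcopy(saved)

print(brain["memory"]["name"])  # Alice
```

The damaged 'memory' sector is detected and reloaded while the intact 'motor' sector is left untouched, exactly the selective repair the paragraph imagines.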

This is likely to be the future, if you are brave enough!
