Hacker News had the following article at the top of its list: The Wolf of Wall Tweet. It describes algorithms that used a rumour and the options market to make millions in seconds. The article refers to Flash Boys as well, the famous book about high-frequency trading. The big news, however, is that what the article presents as magical is not magical at all: you can do a lot more, and you will read about examples later on.
How does tweet option buying work?
Advances in neural networks have led to Deep Belief Networks (DBNs). In some cases DBNs are able to do natural language recognition and other types of recognition better than human beings, or at least a lot faster. So a DBN trained to read the Twitter firehose and scan lots of news articles will beat humans on speed. Add an interface to options trading and you have what the article describes.
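A minimal sketch of that pipeline in Python. A real system would replace the scoring function with a trained network reading the firehose; the keyword lists and the trade rule below are illustrative assumptions, not a strategy.

```python
# Sketch of a tweet-to-trade pipeline (illustrative only).
# A real system would replace score_tweet() with a trained network;
# the keyword sets and threshold are assumptions.

BULLISH = {"acquires", "beats", "record", "surge"}
BEARISH = {"explosion", "lawsuit", "recall", "misses"}

def score_tweet(text: str) -> int:
    """Crude sentiment: +1 per bullish word, -1 per bearish word."""
    words = set(text.lower().split())
    return len(words & BULLISH) - len(words & BEARISH)

def decide(text: str, threshold: int = 1) -> str:
    """Map a sentiment score to an options action."""
    s = score_tweet(text)
    if s >= threshold:
        return "buy calls"
    if s <= -threshold:
        return "buy puts"
    return "hold"

print(decide("BigCo beats forecasts, revenue hits record"))  # buy calls
print(decide("BigCo faces recall after lawsuit"))            # buy puts
```

The point is the shape of the system, not the scoring: the faster the text arrives and is classified, the earlier the order reaches the options market.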
Taking it to the next level – knowing the future of the economy
What if you knew economic facts with a high degree of certainty before anybody else, and not sub-seconds but hours, days, weeks, even months earlier? This is what the Internet of Things combined with DBNs and automated trading can give you. How?

Imagine you trade car stocks. Via computer vision you are able to count events. There are lots of public street cameras that stream real-time footage of what is happening on the roads. Humans look at them to see if there is a lot of traffic. Computers can use them to recognise and count events. So what would happen if strategically picked street cameras were hooked up to DBNs? You would be able to count how many trucks leave a factory loaded with cars. You would be able to correlate these counts month after month with the revenue figures of car manufacturers, and those in turn with their stock value. At the end of a quarter the manufacturers announce their profits, and a key part of their success depends on how many cars were sold. If you knew weeks or a month in advance that the volume of cars coming out of a factory had picked up dramatically, you would know the stock value will go up. If you bought a large quantity of car stock minutes before the figures come out, trading algorithms would pick up on this and make you lose much of the potential profit. However, if you spread small purchase orders over weeks, HFT cannot detect your strategy.
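The correlation step above can be sketched in a few lines. The truck counts and revenue figures below are made-up numbers for illustration; the idea is that a persistently strong correlation between camera-derived counts and later official figures is what makes the counts tradable.

```python
# Sketch: correlate monthly truck counts (from camera feeds) with
# later-reported revenue. All figures are invented for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

trucks  = [120, 135, 150, 170, 210, 260]   # monthly truck departures counted
revenue = [1.1, 1.2, 1.4, 1.6, 2.0, 2.5]   # revenue ($bn) reported weeks later

r = pearson(trucks, revenue)
print(f"correlation: {r:.2f}")
```

If the correlation holds quarter after quarter, a spike in the camera counts is a buy signal weeks before the announcement.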
Street cams are only the beginning
Using street cams would be only the beginning. Add weather sensors and lots of other sensors and you can do magic at large scale: you would have a dashboard of the real economy before anybody else. If you are interested in this subject, be sure to reach out on LinkedIn…
1. Block chain
The block chain is the heart of digital currencies like Bitcoin. What most people don't realise yet is that the block chain will be used for managing everything from domain names, artist royalties, escrow contracts, auctions, lotteries, etc. You can do away with middlemen whose only reason for being is making sure they keep getting a large cut of the value chain. Unless a middleman or governmental institution adds real value, they are in danger of being block-chained into the past.
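The property that makes this possible can be shown in a few lines: each block commits to the hash of the previous one, so history cannot be quietly rewritten. This is a minimal sketch of the chaining idea only; real chains add proof-of-work, signatures and consensus on top.

```python
# Minimal sketch of block chaining: each block stores the previous
# block's hash, so altering any block invalidates everything after it.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash of a block's contents (data + link to previous block)."""
    payload = json.dumps({"data": block["data"], "prev_hash": block["prev_hash"]},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: dict, prev_hash: str) -> dict:
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def valid_chain(chain) -> bool:
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False                              # contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                              # link to previous broken
    return True

genesis = make_block({"domain": "example.org", "owner": "alice"}, "0" * 64)
chain = [genesis,
         make_block({"domain": "example.org", "owner": "bob"}, genesis["hash"])]
print(valid_chain(chain))          # True
chain[0]["data"]["owner"] = "eve"  # tamper with history...
print(valid_chain(chain))          # ...and validation fails: False
```

Swap "domain" for a royalty split or an escrow condition and you have the middleman-free registries described above.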
2. Biometric security
A good example is the Nymi, a wearable that listens to your unique heartbeat pattern and creates a unique identity. Even if people steal your Nymi, it is of no use to them, since they need your heart to go with it.
3. Deep belief networks
Deep belief networks are the reason why Google's voice recognition is surprisingly accurate, why Facebook can tag photos automagically, why cars can drive themselves, etc.
4. Smart labels
They are only 1 to 3 millimetres in size. They harvest electricity from their environment. They can detect people approaching within half a metre, sometimes even identify them, and identify each product you buy. Your microwave will no longer have to be told how to warm up a frozen meal.
5. Cheap credit-card-sized computers
A $35 Raspberry Pi 2 or Odroid is many times more powerful than the first Google server, yet the size of a credit card. The Parallella is $99, the same size, and has almost ten times more cores than the first Google server.
6. Apps and App Stores for Smart Devices
Snappy Ubuntu Core allows developers to create apps, like mobile apps, and put them on any smart device: robots and drones, wifi hubs, industrial gateways, switches, dishwashers, sprinkler controls, etc. Software developers will be able to innovate faster, and hardware can be totally repurposed in seconds. A switch can become a robot controller.
7. Edge/proximity/fog clouds
Public clouds often have too much latency for certain use cases, and often connectivity loss is not tolerable. Think about security cameras: in a world where 4K IP cameras become extremely cheap, you want machine-learning image recognition to be done locally, not on the other side of the world.
8. Containers and micro-services orchestration
Docker is not new, but orchestrating millions of containers and handling very small microservices is still on the bleeding edge.
9. Cheap personalised robots and drones
£35 buys you a robot arm at Maplin in the UK. It is not really useful for much beyond educating the next generation of robot makers, but robots and drones will have apps (point 6), and personalised robots and drones are happening this year.
10. Smart watches and hubs
Smart hubs know who is in the house, where they are (if you carry a phone, health wearable or smart watch), what their physical state is (heartbeat via smart watch), what your face looks like and what your voice sounds like. Your smart watch will know more about you than you want your relatives to know. Today Google knows a husband is getting a divorce before he does [his wife's searches and use of Google Maps give it away]. Tomorrow your smart watch will know you are heading for a divorce before you do [your heart jumped when you looked at that girl; her heartbeat went wild when you came closer].
Many people are still getting their heads around public and private clouds. Even fewer know about the Internet of Things. The real revolution, however, will be bringing both together, and for that you will need a proximity cloud.
What is a proximity cloud?
In five years' time there will be billions of sensors everywhere. You will be wearing them on your body. They will be in your house, at your job, in your hospital, in your city, in your car, bus or plane, etc.
Now, in a world where a billion people have access to tens or even hundreds of connected devices around them, it is easy to understand that you don't want to send all the data they generate to a public or private cloud. There is simply not enough mobile spectrum or fibre capacity to cater for this.
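A back-of-envelope calculation makes the capacity point concrete. Every figure below is an assumption chosen for illustration, and real per-device rates vary enormously (cameras alone are orders of magnitude higher), but even conservative numbers add up fast:

```python
# Back-of-envelope: why raw sensor data cannot all go to a central cloud.
# All figures below are assumptions for illustration only.

people = 1_000_000_000          # a billion connected people
devices_per_person = 50         # connected devices around each of them
bytes_per_device_s = 1_000      # 1 kB/s average per device (cameras: far more)

total_bytes_s = people * devices_per_person * bytes_per_device_s
print(f"{total_bytes_s / 1e12:.0f} TB per second")

link_bytes_s = 100e9 / 8        # one fully saturated 100 Gbit/s backbone link
print(f"{total_bytes_s / link_bytes_s:.0f} x 100G links needed")
```

Thousands of permanently saturated backbone links just for average sensor chatter: hence the need to filter at the edge.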
What we need is to put intelligence close to the data generators, to determine whether your heartbeat, that video camera stream, the electricity consumption of your boiler, the brake capacity of your car, etc. are within normal limits or are outliers that deserve more attention. Especially where video is involved, you don't want to send it over a public Internet connection if there is nothing interesting on it, e.g. the neighbours' cat just got onto your lawn.
So houses, cars, companies, telecoms, hospitals, manufacturers, etc. will need a new type of equipment close to these data avalanches, in order to filter out the 99.999999% of useless data.
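The filtering itself can be as simple as a rolling baseline per sensor, forwarding only readings that deviate sharply from it. This is a sketch of the idea; the window size and threshold are illustrative assumptions, and a real proximity cloud would run far richer models.

```python
# Sketch of an edge filter: keep a rolling baseline per sensor and
# upload only readings that are statistical outliers.
from collections import deque
from statistics import mean, stdev

class EdgeFilter:
    def __init__(self, window: int = 60, z_threshold: float = 3.0):
        self.history = deque(maxlen=window)   # rolling baseline
        self.z_threshold = z_threshold

    def should_upload(self, reading: float) -> bool:
        """True only for outliers worth sending to the central cloud."""
        if len(self.history) >= 2:
            mu, sigma = mean(self.history), stdev(self.history)
            outlier = sigma > 0 and abs(reading - mu) > self.z_threshold * sigma
        else:
            outlier = False                   # not enough baseline yet
        self.history.append(reading)
        return outlier

heart = EdgeFilter()
readings = [62, 61, 63, 62, 60, 61, 140]      # last beat is an anomaly
uploaded = [r for r in readings if heart.should_upload(r)]
print(uploaded)  # [140] -- everything else is discarded at the edge
```

Six of seven readings never leave the premises; only the anomaly costs bandwidth.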
An example: if you are a security company that manages tens of thousands of video cameras across thousands of companies, you want to know when a thief walks by or an assault happens. You will train machines to decide the difference between a human walking past and a cat. However, burglars will soon find out that the computer vision has a flaw: wear a cat disguise and you can get past. It is this type of event that will trigger a central cloud platform to request all videos of a local business from the last 24 hours, retrain its visual model so it knows humans can wear animal suits, and then push the update to the proximity clouds on all its customers' premises. The alternative is storing all video streams in the cloud, which would require enormous bandwidth, or even worse, not knowing what happened and being in the press the next week for the "cat-suit" flop.
Recently presented at TED, Aurasma is a mobile augmented reality app that impresses everybody:
This is the future of mobile. You go to a museum and get all the info about a painting in a live video overlaid on top of it. You could get recipes for a fruit or vegetable you have never prepared before. You get instructions on how to install your WiFi router. There are a lot of possibilities, and most are still to be invented.
In a video posted on YouTube in January 2011, PhD student [now, not surprisingly, Dr.] Zdenek Kalal shows off his doctoral thesis: Predator. Predator is a computer vision algorithm that shows how much this nascent industry has matured in a few years.
Afterwards the face is automatically recognised, even when the head is turned sideways.
Computer vision is one of those domains that has been underutilised by most, except of course by Facebook, Google, etc. However, in an age where people are moving from voice to video chat and even continuous live broadcasting, everybody who wants to add extra value for end-users, customers or advertisers should be looking at the possibilities of computer vision. Imagine what is possible if you combine a Kinect or Leap with Predator: an online advertisers' and secret services' paradise.
The whole video can be found here: