Archive for November, 2014

Tax-avoiding Dotcoms playing Russian Roulette with their Stock Price

November 30, 2014

I am watching “We’re Not Broke”, the documentary about US Uncut and tax avoidance by large corporations. The story behind all this is not new. Throughout history corporations have tried multiple times to put the chief financial officer in charge of generating better corporate results. A business executive would focus on understanding customers better, giving them what they ask for and selling them more. A technical executive would create a blue ocean strategy through some technical innovation that puts the company at the centre of a new universe and makes competitors irrelevant. A financial executive, however, has only numbers to play with. Last time the CFOs cooked the books with aggressive revenue recognition. This time they are focusing on artificially lowering the tax bill via offshore shell companies that receive all the profit even though they don’t have any employees.

A Russian Roulette Game with Stock Prices
A large corporation like GE or Bank of America is relatively safe from large groups of customers punishing it for corporate social irresponsibility [CSI]. What are people going to do? Change banks? Buy a fridge elsewhere? It is just not going to happen in big enough numbers to have any impact on profits.

Dotcoms, however, have a weakness that can put their stock price at risk if they want to be the king of CSI: people might actually do what the dotcoms want them to do. Most of the big dotcoms make most of their money from advertising. They put ads everywhere and teach people to click them. Advertisers then pay these dotcoms a lot of money per clicked ad. But what if people, in protest, massively started clicking advertising banners without buying the actual products behind them? The dotcoms would initially see their profits go through the roof, but all their advertising customers would see that they pay a lot more money and get no value at all. Pretty soon the stock prices would be in free fall. The irony is that these protesters could use social networks and online videos to teach others how to join the protest. So one piece of advice to large dotcoms: please pay a responsible amount of taxes and focus your effort on out-innovating the rest of the industries, not on copying their bad habits…

Proximity Cloud, what is it and why you should care…

November 29, 2014

Many people are still getting their heads around public and private clouds. Even fewer know about the Internet of Things. However, the real revolution will be bringing both together, and for that you will need a proximity cloud.

What is a proximity cloud?
In five years’ time there will be billions of sensors everywhere. You will be wearing them on your body. They will be in your house, at your job, in your hospital, in your city, in your car/bus/plane, etc.

Now in a world where a billion people will have access to tens or even hundreds of connected devices around them, it is easy to understand that you don’t want to send all the data they generate to a public or private cloud. There is just not enough mobile spectrum or fiber capacity to cater for this.

What we need is to put intelligence close to the data generators to determine whether your heartbeat, that video camera stream, the electricity consumption of your boiler, the brake capacity of your car, etc. are within normal limits or are outliers that deserve more attention. Especially where video is involved, you don’t want to send it over a public Internet connection if there is nothing interesting on it, e.g. the neighbours’ cat just wandered onto your lawn.

So houses, cars, companies, telecoms, hospitals, manufacturers, etc. will need some new type of equipment close to the data avalanches in order to filter out the 99.999999% of useless data.
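The filtering idea above can be sketched in a few lines. This is a minimal illustration, not any particular product: the sensor names, normal ranges and readings are all hypothetical, and a real proximity cloud would use trained models rather than fixed thresholds.

```python
# Minimal sketch of edge filtering on a proximity-cloud node.
# Sensors, ranges and readings below are hypothetical illustrations.

NORMAL_RANGES = {
    "heartbeat_bpm": (40, 180),   # resting to heavy exercise
    "boiler_watts": (0, 3000),    # typical domestic boiler draw
}

def is_outlier(sensor, value):
    """Return True when a reading falls outside its normal range."""
    low, high = NORMAL_RANGES[sensor]
    return not (low <= value <= high)

def filter_readings(readings):
    """Keep only the outliers that deserve a trip to the central cloud."""
    return [(s, v) for s, v in readings if is_outlier(s, v)]

readings = [
    ("heartbeat_bpm", 72), ("heartbeat_bpm", 210),
    ("boiler_watts", 1500), ("boiler_watts", 9000),
]
print(filter_readings(readings))
# → [('heartbeat_bpm', 210), ('boiler_watts', 9000)]
```

Only the two abnormal readings survive; the 99%+ of normal data never leaves the premises.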

An example: if you are a security company that manages tens of thousands of video cameras in thousands of companies, then you want to know when a thief walks by or an assault happens. You will train machines to decide what the difference is between a human walking past and a cat. However, burglars will soon find out that computer vision has a flaw: wear a cat disguise and you can get past. It is this type of event that will trigger a central cloud platform to request all video from a local business over the last 24 hours, retrain its vision model so it knows humans can wear animal suits, and then push the updated model to the proximity clouds on all its customers’ premises. The alternative is storing all video streams in the cloud, which would require enormous bandwidth, or even worse, not knowing what happened and being in the press the next week as the “cat-suit” flop.

IoT revolution in the making…

November 29, 2014

As announced previously, if you are into IoT, proximity clouds, robotics, next-generation networking equipment, etc. then you want to subscribe to Telruptive, because I plan on giving readers early access to IoT innovations that are not yet public. This message is the first of its kind. Ubuntu will make some major IoT announcements in the coming months. The type that changes industries. So if you are working on something that can be connected to the Internet and can house an ARMv7, Intel Quark or better, then please reach out to me on LinkedIn and tell me what you are working on. If it meets our innovation criteria then we will let you in on some of our secrets before anybody else. We want to give your innovation wings and make 2015 a magical year for innovators…

Eliminating RFPs to make enterprise software sexy

November 28, 2014

Today I had a meeting that could be the beginning of the end of RFPs for buying software. RFPs are the tool established buyers and vendors use to keep new entrants at bay. However, I haven’t met anybody who says they love writing or responding to them. The effect of RFPs on software is perverse. The main problem is that you can’t ask whether the software is beautiful, easy to use, fast to integrate, efficient, effective at solving a business problem, secure, etc. Instead you ask whether the vendor provides training, because you assume the software is ugly and difficult. You ask whether they offer consultancy services, an SDK or a connector library, because you assume integration will be hard. You assume you will need to customise it for months because it will not be effective out of the box. But most importantly, since you will be stuck with the software for years, you ask whether it supports every potential feature that perhaps in five years might be needed for five minutes. It is this last set of questions that kills any innovation and ease of use in business software. A product manager on the receiving end will get funding to add those absurd features when customers ask for them. A career-limiting move would be to ask for budget to remove useless features or to admit that your product looks worse than Frankenstein.

So how can you make sure that software is beautiful, does what it is supposed to do efficiently and effectively, is fast, nimble, easy to use, secure, scalable, fast to integrate, future proof, etc.? You do what you do when you buy a car: you ask for the keys to different models, take them for a serious spin and push them to their limits.

So what you propose is a three-month PoC for each potential solution?
No, what I propose is being able to get your hands on all the different alternative software solutions, deploy, integrate and scale them in hours or even minutes, and then unleash a bunch of automated performance tests and real end-users on them, even some ethical hackers or competitors.

If the software does what it says on the tin and is effective, efficient, beautiful, secure, fast, scalable, easy, etc., then you negotiate pricing or use it for a minimum viable product.

It used to be impossible to do all of this in hours, but with solutions to quickly deploy private clouds and with cloud orchestration solutions like Juju, we are actually planning to try this approach with a real customer and real suppliers. To be continued…

Internet of Things Challenges and Opportunities

November 21, 2014

IoT is one of the biggest potential new revenue streams but also one of the most challenging technical problems we have today.

The technical challenges

IoT is not just sensors + Big Data analytics + cloud + short-range low-energy networking and the Internet. The real problem is that you have to be good at many different technologies that used to be separate, and one mistake can have disastrous effects. You have to be good at miniature sensors that need to run for two years on one tiny battery and use software that even the biggest geek hates to work on. At making sure IPv6 networking is adapted to these small-footprint devices with innovations like CoAP and 6LoWPAN. You need to learn about the world of microcontrollers, open source hardware like Arduino, micro-computing platforms like Raspberry Pi and Edison, ARM Cortex, Intel Quark, etc. You also need to know about new and old low-energy networking technologies like Zigbee, Bluetooth Low Energy, etc.

Afterwards you want your sensors to be connected to a hub, because otherwise you would need a SIM or WiFi in each sensor, which would drain the battery. So you need to make a smart hub that ideally can run apps from different developers and can support lots of new devices. However, you also want devices to support peer-to-peer technologies like Thread or new standards from Intel, Qualcomm or any of the numerous standardisation bodies. You want to use 3D printing to print an attractive casing. You want to use crowd-funding to sell your smart hub. You want mobile apps to work flawlessly with IoT. You need to know about Powerline, gesture control, in-building location tracking, voice control, etc. if you want to compete with the best smart hubs. You need to know GPRS, 3G, 4G, White Spaces, long-haul radio, WiFi or fiber broadband to communicate with the rest of the world.

On the cloud side, be it public or a private OpenStack, you need to use the latest DevOps tools, cloud orchestration tools and containers like Docker to deploy scale-out queues, real-time stream processing and other Big Data analytics solutions. You need to be able to train deep belief networks and push models to hubs and sensors. Recognise threatening video images. You need to be able to do rolling upgrades and continuous deployment of updates, developer apps, etc. Manage operations of millions of devices and billions of sensors. You want a store. A developer eco-system.

And when you have finally mastered all of this: make one security mistake and a hacker on the other side of the world can control your house, business, city or country.
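To make the “two years on one tiny battery” constraint concrete, here is a back-of-the-envelope calculation. The 225 mAh capacity is an assumption based on a common CR2032 coin cell; real designs and chemistries vary.

```python
# Back-of-the-envelope current budget for a sensor that must last
# two years on a single battery. The 225 mAh CR2032 coin cell is an
# assumed example; real capacities and loads vary.

battery_mah = 225.0                      # assumed coin-cell capacity
hours = 2 * 365 * 24                     # two years in hours
budget_ua = battery_mah / hours * 1000   # average budget in microamps

print(f"average current budget: {budget_ua:.1f} uA")
# → average current budget: 12.8 uA
```

An average draw of roughly 13 microamps is why these devices spend nearly all their time asleep and why radios like Bluetooth Low Energy and protocols like CoAP matter so much.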

Now the business opportunities are huge as well. Shave a couple of percentage points off the production cost of a car and you can save hundreds of millions. Track a global epidemic or the vital signs of a billion people and you can save millions of lives. Give millions of developers a new way to channel their creativity and the Angry Birds of IoT will create new industries. The one that changes the habits of people will be the next billionaire.

Where is the money? Industrial IoT. Where is the innovation? Home automation and wearables. You can’t pick one. You need to connect innovation with money if you want to lead the IoT revolution. If somebody else does it for you, they can make your solution irrelevant.

Several telecom operators to run into financial problems in the next three years…

November 21, 2014

In 2017 several telecom operators will run into financial problems, Vodafone being the best known, unless they start changing today. Why?

The telecom business is very capital intensive. Buying spectrum, rolling out next-generation mobile networks and bringing fiber connections to every home and business costs enormous amounts of money. Traditionally operators were the main users of their networks and earned large margins on the services that ran on top of them. The truth today is that telecom operators have been completely sidetracked. They no longer have any control over the mobile devices used on their networks, nor over the services. Data is growing exponentially and is already clogging their networks. A data tsunami is on the horizon. Operators see costs ballooning and ARPU shrinking. There is no way they can start asking substantially more for broadband access. Obama just killed any hope of adding a speed tax on the Internet. The EU wants to kill juicy roaming charges. However, the future will be even worse.

New disruptive competitors have entered the market in recent years. Google Fiber is offering gigabit speeds both for uploading and downloading. YouTube and Netflix generate the majority of Internet traffic in most countries. Most streaming video is broadcast in SD quality, but Netflix is already broadcasting in 4K, or ultra-high-definition, quality on Google Fiber. This means traffic volumes of between 7 and 19 GB per hour depending on the codec used. Take into account that different family members are often watching two or more programmes at the same time. The end result is that today’s networks and spectrum are completely insufficient. Now add the nascent IoT revolution. Every machine on earth will get an IP address and be able to “share its feelings with the world”. Every vital sign of every person in the richer parts of the world will be collected by smart watches and tweeted about on social networks. 90% of the communication running inside Facebook’s data centre is machine-to-machine communication, not user-related communication. And Facebook hasn’t even introduced IoT or wearables yet. You can easily imagine them helping even the biggest geek with suggestions on which girl to talk to and what to talk about, via augmented reality goggles and with the help of smart watches. Yes, it is a crazy example, but which telecom marketing department would have given even $1 to Zuckerberg had he pitched Facebook to them when it was still known as TheFacebook? It is the perfect example of how “crazy entrepreneurs” make telecom executives look like dinosaurs.
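The 7 to 19 GB per hour figure is easy to sanity-check: it follows directly from a stream’s bitrate. The bitrates used below (16 and 42 Mbit/s) are illustrative assumptions roughly bracketing 4K codecs of the time, not published Netflix numbers.

```python
# Sanity check on the 7-19 GB per hour figure for 4K streaming:
# convert a constant bitrate in Mbit/s to GB consumed per hour.
# The example bitrates are assumptions; actual codec rates vary.

def gb_per_hour(mbit_per_s):
    """Gigabytes (decimal) consumed per hour at a constant bitrate."""
    return mbit_per_s * 3600 / 8 / 1000  # Mbit/s -> MB/s -> GB/h

for mbps in (16, 42):  # assumed low and high end of 4K codecs
    print(f"{mbps} Mbit/s -> {gb_per_hour(mbps):.1f} GB/hour")
# → 16 Mbit/s -> 7.2 GB/hour
# → 42 Mbit/s -> 18.9 GB/hour
```

Multiply by two or three simultaneous streams per household and the pressure on access networks becomes obvious.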

This brings us to the internals of how telecom operators are run. Marketing departments decide what customers MUST like, often based on more than doubtful market studies and business plans. In contrast, the mobile app stores of this world just let customers decide. Angry Birds might not be the most intelligent app, but it sure is a money maker. Procurement departments decide which network and IT infrastructure is best for the company. Ask them what NFV or SDN means and the only thing they can sensibly respond with is an RFP identifier. Do you really think any procurement department can make a sensible decision on which network technology will be able to compete with Google? More importantly, can they make sure these solutions are deployed at Google speed, integrated at Google speed and scaled out at Google speed? If they pick a “telecom-grade feature monster” that takes years to integrate, then they have killed any chance of that operator being innovative. With all the telecom-grade solutions operators have, why is it that Google’s solutions are more responsive, offer better quality of service and are always available? Vittorio Colao, the Vodafone CEO, was quoted in a financial newspaper yesterday saying Vodafone is going to have to participate in the crazy price war around digital content because BT has moved into mobile. So one of the biggest telecom operators in the world has executive strategies like launching new tariff plans [think RED in 2013], paying crazy money to broadcast football matches, bundling mobile with fixed to be able to discount overall monthly tariffs and erode ARPU even more, etc. If you can get paid millions to just look at what competitors and dotcoms are doing and badly copy them [the list is long: hosted email, portals, mobile portals, social networks, virtual desktops, IaaS, streaming video, etc.] then please allow me to put your long-term viability into question.

So can it actually be done differently? YES, for sure. What if operators enabled customers to customise communication solutions to their needs? Communication needs have not gone away; if anything they have grown. WhatsApp, Google Hangouts, etc. are clear examples of how SMS and phone calls can be improved. However, they are just the tip of the iceberg of what is possible and what should be done. Network-integrated apps via telco app stores would give innovators a chance to launch services that customers really like. Hands up who would pay to get rid of their current voicemail? Hands up who really loves their operator’s conference bridge and thinks it is state of the art? Hands up who is of the opinion that a bakery is absolutely not interested in knowing what its customers think about its products after they have left the shop?

Last week the TAD Summit in Turkey had a very special presentation from Truphone, one of the few disruptive mobile operators in the world. No wonder it won the best presentation award. Truphone, with the help of partners, deployed a telecom solution in minutes that included key components like IMS, an SDP, HLR integration, one hundred phone numbers, dashboards, interactive voice responses, etc. Once deployed, the audience could immediately start calling in and participating. The numbers of the people in the audience, their home operator, the operator that originally sold them their SIM, their age and their responses to interactive questions were all registered, with the results shown on a real-time dashboard. If the audience had been in different locations, they could have been put on an interactive map as well. The whole solution took only a few weeks to build, with a team of people who all had day jobs. The surprising thing is that it was all built with open source software. It is technically possible to innovate big time in telecom and bring new services to market daily, all at a fraction of today’s cost. The technology is no longer the limiting factor. Old-school thinking, bureaucracy and incompetence are the only things holding operators back from changing their destiny. Whatever they do, they shouldn’t act like former Nokia executives in a few years and tell the world that Android and the iPhone took them by surprise. Dear mister operator, you have been warned. You have been given good advice and examples of how to do it better. Now it is time to act on them…

A Layman’s Guide to the Big Data Ecosystem

November 19, 2014

Charles “Chuck” Butler, a colleague at Canonical, wrote a very nice blog post explaining the basics of Big Data. It not only explains them but also shows how anybody can set up Big Data solutions in minutes via Juju. Really recommended reading:

This is a good example of the power of cloud orchestration. An expert creates charms and bundles with Juju, and afterwards anybody can easily deploy, integrate and scale the Big Data solution in minutes.

Samuel Cozannet, another colleague, used some of these components to create an open source tweet sentiment analysis solution that can be deployed in 14 minutes and includes autoscaling, a dashboard, Hadoop, Storm, Kafka, etc. He presented it at the OpenStack Developer Summit in Paris and will shortly be providing instructions for everybody to set it up.
