Archive for April, 2012

5 Ideas for Amazon AWS

Although Amazon AWS already offers a very large number of solutions, here are 5 ideas for what Amazon could add next.

API Marketplaces

There are thousands of APIs out there. What is missing, however, is an easy way for companies to control their costs. In line with the other marketplaces Amazon runs, there could be an API marketplace. An API marketplace would allow third-party API providers to let Amazon do the charging. Companies would pay one bill to Amazon AWS and use thousands of APIs. Third-party API providers would win as well, because they often cannot profitably charge small amounts to a large set of developers. Amazon already sends you a bill or charges your credit card, so adding a few dollar or euro cents for external API usage would be easy to do, and the third-party provider would no longer have to lock users into large monthly fees to offset credit card and administration charges.

Amazon would of course be the big winner, because it could take a revenue share on these thousands of APIs. End users would win too, because they could easily compare different APIs, get community feedback from other developers, and pick the APIs with the best reputation: the typical advantages of any online marketplace. Cross-selling, advertising and other marketplace mechanics could be reused by Amazon as well. A final advantage would be to have Amazon sit in the middle and offer a standard interface, with third parties offering competing implementations, which would allow developers to switch providers easily.
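
As a rough illustration of the billing side, here is a minimal sketch of per-request metering for such a marketplace. Everything in it (the `MeteringLedger` class, the rate card, the provider names) is a hypothetical construction for illustration, not an existing AWS service:

```python
from collections import defaultdict

class MeteringLedger:
    """Accumulates micro-charges per customer so the marketplace
    operator can fold them into one consolidated monthly bill."""

    def __init__(self, rate_card):
        self.rate_card = rate_card          # provider id -> price per call
        self.charges = defaultdict(float)   # customer id -> amount owed

    def record_call(self, customer_id, provider_id):
        self.charges[customer_id] += self.rate_card[provider_id]

ledger = MeteringLedger(rate_card={"geo-lookup": 0.0004, "tts": 0.002})

# Every proxied API request adds a fraction of a cent to the bill;
# the provider gets a revenue share, the customer gets one invoice.
ledger.record_call("acme-corp", "geo-lookup")
ledger.record_call("acme-corp", "tts")
print(round(ledger.charges["acme-corp"], 4))  # 0.0024
```

The point is that micro-amounts only become billable when somebody already owns the payment relationship, which is exactly what Amazon has.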

Language APIs

A lot of applications would be helped if they could use language APIs that are paid per request. "Language APIs" is an umbrella term for text-to-speech, speech recognition, natural language processing and even mood analysis APIs. All of these are available individually, but there is a clear economies-of-scale effect: the more speech you transcribe or text documents you process, the better your algorithms become. There is also an over-supply of English-language APIs but an under-supply for almost every other language in the world, except perhaps Spanish, French and German. Another problem with existing APIs is that even the most basic subscription plan assumes a high monthly volume. An example is Acapela's VaaS pricing, which starts at a minimum of €1500. Very few applications will use this amount of voice.
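
To make the pay-per-request idea concrete, here is a sketch of what such a call could look like. The endpoint, parameters and response format are hypothetical; the point is paying cents per request instead of a €1500 minimum:

```python
import requests

def synthesize(text, lang="nl-NL", api_key="YOUR_KEY"):
    """Call a hypothetical pay-per-request text-to-speech API."""
    resp = requests.post(
        "https://language.example-cloud.com/v1/tts",  # hypothetical endpoint
        headers={"Authorization": "Bearer " + api_key},
        json={"text": text, "language": lang},
    )
    resp.raise_for_status()
    return resp.content  # e.g. an MP3 payload, billed per request

audio = synthesize("Goedemiddag, uw pakket is onderweg.")
with open("greeting.mp3", "wb") as f:
    f.write(audio)
```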

M2M APIs and Services

Amazon is already working hard on Big Data solutions. M2M sensors can generate large volumes of data pretty quickly, and S3 or DynamoDB would be ideal for storing it. What is missing, however, is an easy way to connect and manage large numbers of sensors and devices and their accompanying applications. There are few standards, but examples like Pachube should give Amazon plenty of inspiration. Especially end-to-end service management, provisioning, SLA management, etc. could use a big boost from a disruptive innovator like Amazon. Amazon could also offer M2M sensor intelligence; see my other article on this subject.
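
As a sketch of the storage half, here is what pushing sensor readings into DynamoDB looks like with the boto3 SDK. The table name, key schema and payload are assumptions; an AWS account and a pre-created table are required:

```python
import time
import boto3

dynamodb = boto3.resource("dynamodb", region_name="eu-west-1")
table = dynamodb.Table("sensor-readings")  # hypothetical table

def store_reading(sensor_id, value):
    table.put_item(Item={
        "sensor_id": sensor_id,         # assumed partition key
        "ts": int(time.time() * 1000),  # assumed sort key (ms timestamp)
        "value": str(value),            # stored as a string for simplicity
    })

store_reading("temp-0042", 21.7)
```

The hard part Amazon has not solved is everything around this snippet: provisioning the sensors, managing their lifecycle, and monitoring their SLAs.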

Mobile APIs and Solutions

With billions of phones out there, mobilizing the Web will be the next challenge. Securely exposing company data, applications and processes to mobile devices is hard today, and BYOD (bring your own device) is a headache for CIOs. Not everybody has a Mac, so not everybody can sign iPhone apps and launch them on the App Store. Ideally there would be a technical solution for enterprises to manage private app stores, deploy apps on different devices, and send notifications to all or subsets of their employees. Functionality like Usergrid, where developers do not have to build the back-office logic themselves, would also be of interest, as would tools to develop front-ends for different devices; examples like Tiggzi come to mind. There are a lot of island solutions but few really integrated total solutions.
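
As a small illustration of the notification piece, here is a sketch of pushing a message to a subset of employees. The employee directory and the push endpoint are hypothetical:

```python
import requests

EMPLOYEES = [
    {"id": "e1", "dept": "sales",   "device_token": "tok-a"},
    {"id": "e2", "dept": "support", "device_token": "tok-b"},
    {"id": "e3", "dept": "sales",   "device_token": "tok-c"},
]

def notify(dept, message):
    """Fan a message out to every device registered in one department."""
    tokens = [e["device_token"] for e in EMPLOYEES if e["dept"] == dept]
    requests.post(
        "https://mobile.example-cloud.com/v1/push",  # hypothetical endpoint
        json={"tokens": tokens, "message": message},
    )

notify("sales", "New price list available in the private app store.")
```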

Support APIs and Services

Amazon is becoming more and more important in the global IT infrastructure business. This means that solutions will increasingly move to the Cloud, and will sometimes be hybrid cloud. In these complex scenarios, where third parties, Amazon and on-site enterprise services have to be combined, the risk of things going wrong is high. Support services, both from a technical point of view:

  • detect failures and automatically try to solve them
  • manage support ticket distributions between different partners
  • measure SLAs
  • etc.

as well as from a functional point of view:

  • dynamic call centers with temporary agents
  • 3rd party certification programs in case small partners do not have local resources
  • 3rd party support marketplace to offer more competition and compare reputations
  • etc.

are all areas in which global solutions could disrupt the local and island solutions currently in place. A sketch of the first technical bullet follows below.
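
As a sketch of "detect failures and automatically try to solve them", here is a toy watchdog loop. The health URL, the restart hook and the three-strikes threshold are assumptions:

```python
import time
import requests

HEALTH_URL = "https://app.example.com/health"     # hypothetical
RESTART_HOOK = "https://ops.example.com/restart"  # hypothetical

def healthy():
    try:
        return requests.get(HEALTH_URL, timeout=5).status_code == 200
    except requests.RequestException:
        return False

failures = 0
while True:
    if healthy():
        failures = 0
    else:
        failures += 1
        if failures >= 3:                # three strikes: remediate, then escalate
            requests.post(RESTART_HOOK)  # automatic remediation attempt
            print("ticket: service unhealthy, restart triggered")
            failures = 0
    time.sleep(60)  # each probe is also a data point for SLA measurement
```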

How to dramatically reduce the amount of data M2M sensors transmit?

April 26, 2012 1 comment

M2M sensors are predicted to generate more data than their human counterparts in the coming years. Unfortunately, the price paid for moving this traffic will be substantially lower than for human data traffic. So it makes sense to think about ways to dramatically reduce the amount of data M2M sensors transmit.

How to do it?

In recent weeks I have been playing with RapidMiner. This program might soon be installed on a lot of Windows machines next to MS Office. RapidMiner allows complete data-mining laymen to easily extract hidden information from the data they have at hand in files, Excel, Access, databases, etc.

RapidMiner shows how, with some simple drag-and-drop, you can in 5 minutes use complex algorithms like Neural Networks, Support Vector Machines, Bayesian Classifiers, Decision Trees, Genetic Algorithms, etc. to make sense of data.

The fact that you can easily train an algorithm to take a decision on your behalf could be a key factor in reducing the amount of M2M sensor data. Instead of sending all the data to a central point and making decisions there, you would put the intelligence into the sensors, as the sketch below illustrates.
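
Here is a minimal sketch of that idea: train a classifier centrally on labelled historical data, then run the tiny resulting model on the sensor so that only readings it classifies as alarming are transmitted. The training data and thresholds are made up, and scikit-learn stands in for whatever toolkit (RapidMiner included) produces the model:

```python
from sklearn.tree import DecisionTreeClassifier

# Central side: historical (temperature, vibration) readings labelled
# 0 = normal, 1 = needs attention.
X = [[20, 0.1], [21, 0.2], [22, 0.1], [80, 2.5], [75, 3.0], [85, 2.8]]
y = [0, 0, 0, 1, 1, 1]
model = DecisionTreeClassifier(max_depth=2).fit(X, y)

# Sensor side: a model this small fits in firmware; transmit only
# when the local decision says the reading matters.
def maybe_transmit(reading, send):
    if model.predict([reading])[0] == 1:
        send(reading)  # rare case: alarm data goes out
    # otherwise: stay silent and save bandwidth

maybe_transmit([82, 2.9], send=lambda r: print("ALERT", r))   # transmits
maybe_transmit([21, 0.15], send=lambda r: print("ALERT", r))  # stays silent
```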

This artificial sensor intelligence would not be limited to single-sensor failures. By applying genetic and swarm algorithms and copying Mother Nature, you could have groups of sensors behave like, for instance, an ant colony. Individual sensors would share alarm data, and if enough sensors, or the right sensors, agree, they would launch a collective alert; a toy version is sketched below.
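
A toy quorum version of the ant-colony idea: each sensor shares its local alarm vote with its neighbours, and a collective alert goes out only when enough of them agree. The 60% quorum is an arbitrary assumption:

```python
def collective_alert(votes, quorum=0.6):
    """votes: mapping of sensor id -> True if that sensor sees an anomaly."""
    agreeing = sum(1 for v in votes.values() if v)
    return agreeing / len(votes) >= quorum

votes = {"s1": True, "s2": True, "s3": False, "s4": True, "s5": False}
if collective_alert(votes):
    print("collective alert: 3 of 5 neighbours agree")  # one uplink message, not five
```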

Wireless technologies based on, for instance, White Spaces can be used, and are already being used in Cambridge for example, to let many sensors communicate with one another cheaply. Energy-harvesting techniques should also be used to avoid having to install batteries in the sensors.

The last piece of the puzzle would be extra features in an M2M PaaS to manage the distribution of intelligence across decentralized, self-organizing sensor networks. Sensors are likely to start out sending data to a central server, where humans train computers on which types of data are critical. Once the trained models are available, they can be distributed to the sensors. From then on, the M2M PaaS would focus on adjusting the algorithms whenever certain alarms were missed or alarms were launched unnecessarily.

What if there were no Over-the-Top-Players?

What would have happened to the telecom industry if there were no over-the-top players [OTPs] and disruptive innovators?

  • Nokia would still be the phone leader and touch screens would not exist.
  • Parlay would be the easiest-to-use telecom asset exposure API.
  • MMS would be the only way to share images.
  • On-site equipment would be necessary for each enterprise.
  • LTE would be a data highway without demand.

If this sounds boring, then you are right. OTPs and disruptive innovators have brought back excitement. People want to know about new phones, faster data plans, tablets, etc.

The more important question, however, is why telecom operators and traditional telecom providers are not able to innovate like OTPs.

There are many reasons: cannibalization of existing business, overly complex back-office architectures, etc. The main reason, however, is mindset! Most people who have spent at least 5 years in the telecom domain have started to assume that telecom beliefs are as strict as the fundamental laws of physics:

  • if it is not a standard, it does not exist
  • if it is not telco grade and built out of expensive software and hardware, it does not exist
  • if it is not controlled from the network, it does not exist
  • if it generates very high revenue and cash flow for me, then it must be good for customers
  • if it does not run in my data center and is not fully integrated into all my BSS and OSS, then I cannot launch it

Unfortunately most operators you speak to still do not realize that these rules have been successfully broken by OTPs.

The same mistakes are still made every day: Joyn, WAC, OneAPI, etc. are all trying to be standards before being successful, all have to be integrated into every back-office system before launch, etc.

The day telecom operators realize that they have overpaid suppliers to deliver an architecture too complex to be useful, implemented according to standards bodies that make things overly complex in order to stay relevant, will be the day OTPs should be afraid. Unfortunately, that day looks like it will never come. If you had to bet your house on one outcome, what would it be: telecom operators getting over their flawed traditional beliefs, or OTPs succeeding in converting operators into bit pipes? There is still time to change the future, but the future starts today…

Where should VCs invest?

If you are a VC and you are unsure where to invest, this post might be of interest to you.

Here are some disruptive technologies and ideas that startups might already be working on, or for which you might want to assemble a team:

Alternative networks

WiFi and 3/4/5G have their limitations. Any alternative networking technology that can change complete industries is probably a good pick. An example would be LiFi.

Networks as a Service – Software-Defined Networks – Openflow

This area is very hot at the moment. Today's networks are very hard to configure and manage, they are tightly coupled with hardware, and they cannot be extended easily.

Anything that makes Software-Defined Networks/Openflow easy for mass adoption is going to be a winner.

Anything that allows enterprises to buy a box once and get the network software later based on day-to-day business requirements; think of an app store for Openflow, for example.

Anything that links Openflow to the Cloud.

M2M Disruptive Technologies

Printed electronics to make sensors cheaper.

Battery-free electronics to make sensors more mobile and less expensive to maintain.

Auto-discovery sensor mesh networks to avoid paying expensive 3/4G subscriptions.

M2M appstores to allow people to reuse the work others did.

Super-easy M2M APIs/PaaS. Look at Pachube as a model to beat.

Cloud Disruptive Technologies

Niche SaaSification, in which applications that are only used in small niches can be offered globally as SaaS subscriptions.

Plug-and-Cloud equipment for hybrid cloud and exposure: on-site equipment that allows enterprises to expose their internal assets to the Cloud in an easy and secure way, e.g. employee single sign-on, secure exposure of company data, and secure exposure and easy integration of company applications.

Plug-and-play SaaS integrations that allow multiple SaaS offerings to be easily integrated without programming.

Mobile

Mobile PaaS = mobile GUI drag-and-drop designer + no-programming back-end systems like Usergrid + plug-and-play integration with external and enterprise APIs + enterprise mobile app/SaaS stores + BYOD-made-easy solutions (some elements are optional)

Big Data / Data Analytics

Visual data miner as a service

Big Data PaaS (easy tools/APIs for complex big data operations like mood analysis, natural language processing, etc.)

Gamification/Crowdsourcing

Kaggle-type services for other domains, e.g. a competition to create the easiest/best mobile interface or API.

Kaggle + Kickstarter => competitions combined with crowdfunding: whoever builds the best solution for a given problem gets their venture funded.

Nail-it-then-scale-it / Lean Startup crowdsourcing, in which ideas get tested step by step (e.g. paper prototypes, business model discovery, etc. before an actual prototype) and funding is delivered bit by bit, ideally with the funders receiving stock options in the new venture.

Enterprise/Consumer Telecom

Managed enterprise software-defined networks or BYOD: services that help enterprises maintain their networks, or the devices their employees bring along, in a managed way, so that no experts need to be hired and the service is pay-as-you-go instead of CAPEX.

Cloud + set-top boxes: app stores for ADSL/cable modem set-top boxes, SDKs to manage large fleets of consumers' set-top boxes, etc.

Conclusion

These are just a handful of ideas. If you want more or need more detail, let me know at maarten at telruptive dot com. Also, if you are in need of an external adviser or executive for a new venture, let me know…

Data Analytics as a Service

April 18, 2012 2 comments

Every company uses Microsoft Office, and especially Excel, to do some sort of data analytics. However, data volumes have grown exponentially and have outgrown spreadsheets. To get data analytics done at Big Data scale you need experts in the business domain, in data analytics, in data migration/extraction/transformation/loading, in server management, etc. This makes it expensive and usable only by the happy few.

Why? There must be easier ways to do it.

I think there are. For those unfamiliar with data analytics but eager to learn, take a look at a product called RapidMiner. It is close to amazing how a non-expert can use Neural Networks, Decision Trees, Support Vector Machines, Genetic Algorithms, etc. and get meaningful results in minutes. The other amazing part is that RapidMiner is open source, hence free for use by a single analyst.

Rapid-i.com, the company behind RapidMiner, also offers server software to run data analytics remotely. This is where Big Data opportunities meet easy data analytics. What if RapidMiner analytics could be run on hundreds of servers in parallel, with you paying by usage just as you pay for any Cloud compute or storage instance?

RapidMiner as a Service

RapidMiner as a Service (RMaaS) would allow millions of business people to analyse Big Data "without Big Investments". This type of Data Analytics as a Service would give any SME the same data analytics tools as large corporations. Data could come from Amazon S3, Amazon's DynamoDB, hosted Hadoops, any web service, any social network, etc. A sketch of how such a service might be consumed follows below.
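
Here is a sketch of a hypothetical RMaaS job submission: point the service at data already in S3, name a stored analytics process, and pay per run. The endpoint, parameters and response format are all assumptions:

```python
import requests

job = {
    "process": "churn-prediction-v1",        # a process designed in RapidMiner
    "input": "s3://my-bucket/customers.csv",  # data already in the cloud
    "output": "s3://my-bucket/scored.csv",
    "parallelism": 100,                       # fan out over many servers
}
resp = requests.post("https://rmaas.example.com/v1/jobs", json=job)
print(resp.json())  # e.g. {"job_id": "42", "status": "queued"}
```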

Visual as a Service

RapidMiner as a Service is only one of many domain-specific tools that could be offered as a visual drag-and-drop Cloud service. VAS as a Service is another example, in which complex telecom assets can easily be combined in a drag-and-drop manner. There are many more. These services will be the real revolution of Cloud Computing, since they combine IaaS/PaaS/SaaS into a new generation of solutions that bring large savings for new users and potentially large revenues for their providers…

Is IaaS a good business for operators?

The short answer is no, unless you operate in a part of the world where there is no regional IaaS. The longer answer follows.

Amazon runs its AWS services with a cost-plus pricing model, meaning it aims for a profit margin of roughly 10%. Every time its economies of scale improve, it lowers prices to get back to that 10%, as the toy calculation below shows.
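
A toy illustration of cost-plus pricing at a fixed 10% margin: when the unit cost falls, the price follows it down. The numbers are made up:

```python
def cost_plus_price(unit_cost, margin=0.10):
    """Price that recovers unit cost plus a fixed percentage margin."""
    return round(unit_cost * (1 + margin), 4)

print(cost_plus_price(0.100))  # 0.11  -> price at today's unit cost
print(cost_plus_price(0.080))  # 0.088 -> price after a scale improvement
```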

Although Amazon has healthy gross margins, IaaS is all about investing in hardware and R&D, which means that volume is the name of the game. While Amazon is turning IaaS into a billion-dollar business, the number-two player (Rackspace) sits at around $285M for its IaaS business. This shows that the winner takes it all.

How are operators going to compete?

Competing on price with Amazon AWS, Rackspace, GoGrid, etc. is not an option, given that they lower their prices continuously.

Competing with better technology is also almost impossible, because Amazon is THE market leader in IaaS innovation, with services like DynamoDB. IT players are just playing catch-up, and any operator that runs an RFQ process will simply be buying previous-generation software and hardware. In Cloud terminology: legacy systems.

Operators could offer better SLAs than the 99.95% offered by Amazon. However, in the world of cloud computing, SLAs do not mean much. If you want availability, you are better off implementing a multi-cloud strategy in which you use multiple cloud providers and your software can move dynamically between them; a minimal sketch follows below.
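
A minimal sketch of that multi-cloud idea: health-check providers in order of preference and route work to the first one that responds. The endpoints are hypothetical:

```python
import requests

PROVIDERS = [
    ("aws",       "https://eu.app-on-aws.example.com/health"),  # hypothetical
    ("rackspace", "https://eu.app-on-rs.example.com/health"),   # hypothetical
]

def pick_provider():
    """Return the first healthy provider, failing over in order."""
    for name, url in PROVIDERS:
        try:
            if requests.get(url, timeout=3).status_code == 200:
                return name
        except requests.RequestException:
            continue  # provider unreachable: try the next one
    raise RuntimeError("all providers unavailable")

print("routing traffic to", pick_provider())
```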

Trust? IBM and other IT players can provide that as well. They have been in the Cloud space longer than telecom operators and are quicker at deploying technology.

Network reliability, QoS and speed? Yes, but only for a niche segment of the market, and a segment that is unlikely to be big given the local nature of most operators.

Geo-localization reasons? YES. This is probably the only valid reason why operators in Africa, some parts of Asia and Latin America should look at IaaS. In Europe, the US, Australia, Japan, Korea, Singapore, etc., however, this cannot be the driver.

So unless you are targeting some very specific low-latency or high-data-volume niche markets, or are in a part of the world where reliable networking and electricity are hard to get, you are unlikely to make your CEO happy with IaaS. Think instead about other parts of Cloud Computing: PaaS, business processes as a service, networking as a service, etc.

M2M sensors without batteries

April 11, 2012 2 comments

In "The computing trend that will change everything", MIT's Technology Review shows how the power consumption of computing has improved at the same speed as computer chips. Especially striking is the analogy that a MacBook Air running at the efficiency of a 1991 PC would drain a fully charged battery in 2.5 seconds. The trend will continue towards wireless, battery-free sensors: sensors that harvest their energy from existing radio waves. Find out more in "powering the internet of things without batteries".

Imagine the possibilities if sensors no longer had to have batteries. Everything from traffic to weather, human health and retail could be revolutionized. Mobile sensors will start generating massive amounts of data, called nanodata. These sensors are unlikely to hold a SIM card, given the importance of energy efficiency. Operators should look for M2M business models that go beyond pure connectivity, and should think about low-cost, low-energy wireless mesh networks instead of 3G/4G/5G…
