From the Founder and CEO, Cloudscale Inc.

Bill McColl



Top Stories by Bill McColl

In the previous article we looked at how realtime cloud analytics looks set to disrupt the $25B SQL/OLAP sector of the IT industry. What are users looking for from a next-generation, post-SQL/OLAP enterprise analytics solution? Let's look at the requirements: Realtime + Historical Data. In addition to analyzing (historical) data held in databases (Oracle, SQL Server, DB2, MySQL) or datastores (Hadoop, Amazon Elastic MapReduce), a next-gen analytics solution needs to be able to analyze, filter, and transform live data streams in realtime, with low latency, and to "push" just the right data, at the right time, to users throughout the enterprise. With SQL/OLAP or Hadoop/MapReduce, users "pull" historical data via queries or programs to find what they need, but for many analytics scenarios today what's needed instead, to handle ... (more)
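The push/pull distinction above can be sketched in a few lines of Python. This is an illustrative toy, not any real product's API: `pull_query` stands in for an on-demand SQL/OLAP query over stored data, while the hypothetical `PushStream` class filters live events and pushes qualifying ones to subscribers as they arrive.

```python
# Toy contrast between "pull" analytics (query stored data on demand)
# and "push" analytics (filter a live stream, push matches immediately).
# PushStream and its event shape are illustrative assumptions.

def pull_query(historical_rows, threshold):
    """SQL/OLAP style: the user pulls matching rows when they ask."""
    return [r for r in historical_rows if r["value"] > threshold]

class PushStream:
    """Realtime style: matching events are pushed to subscribers on arrival."""
    def __init__(self, predicate):
        self.predicate = predicate
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def ingest(self, event):
        # Filter each event with low latency, then push it out right away.
        if self.predicate(event):
            for callback in self.subscribers:
                callback(event)

# Pull: one batch answer over historical data.
history = [{"value": v} for v in (3, 9, 12)]
print(pull_query(history, threshold=10))  # [{'value': 12}]

# Push: the subscriber sees each qualifying event the moment it arrives.
alerts = []
stream = PushStream(lambda e: e["value"] > 10)
stream.subscribe(alerts.append)
for v in (5, 11, 7, 14):
    stream.ingest({"value": v})
print(alerts)  # [{'value': 11}, {'value': 14}]
```

The design point is the inversion of control: in the pull model the user decides when to look, while in the push model the system decides when the user needs to see something.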

NoHadoop: Big Data Requires Not Only Hadoop

Over the past few years, Hadoop has become something of a poster child for the NoSQL movement. Whether it's interpreted as "No SQL" or "Not Only SQL," the message has been clear: if you have big data challenges, then your programming tool of choice should be Hadoop. Sure, continue to use SQL for your ancient legacy stuff, but when you need cutting-edge performance and scalability, it's time to go Hadoop. The only problem with this story is that the people who really do have cutting-edge performance and scalability requirements today have already moved on from the Hadoop model. A ... (more)
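For readers who haven't worked with it, "the Hadoop model" referred to here is MapReduce: map each record to (key, value) pairs, group by key, then reduce each group. A minimal in-memory sketch (not Hadoop's actual Java API) makes the batch-oriented shape of the model clear:

```python
# Minimal in-memory sketch of the MapReduce model Hadoop popularized.
# Real Hadoop distributes these phases across a cluster; this toy runs
# them in one process to show the programming model's structure.
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    groups = defaultdict(list)
    for record in records:          # map phase
        for key, value in mapper(record):
            groups[key].append(value)
    # shuffle (grouping by key) happened implicitly above; now reduce:
    return {key: reducer(key, values) for key, values in groups.items()}

# The classic word-count example as a mapper/reducer pair.
docs = ["big data", "big hadoop"]
counts = map_reduce(
    docs,
    mapper=lambda line: [(word, 1) for word in line.split()],
    reducer=lambda word, ones: sum(ones),
)
print(counts)  # {'big': 2, 'data': 1, 'hadoop': 1}
```

Note that nothing here is incremental: the whole input is consumed before any answer appears, which is exactly the batch characteristic the article argues against for realtime workloads.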

The Intercloud: Turning Computing Inside Out

The intercloud turns computing inside out. With traditional IT, we move the data to where the computing infrastructure is located. With the data volumes in most application areas growing exponentially, this IT model is now broken. Moving massive volumes of data around means more bandwidth, more storage, and more latency. We need instead to position the computing infrastructure next to where the data is located. With intercloud computing, we can build global apps and services where a single app can operate on data that may be spread across many public clouds and private datace... (more)
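The "move the computation to the data" idea above can be sketched as follows. The `CloudSite` class and its data are purely illustrative (there is no such intercloud API): each site runs a small shipped function against its local data, and only the small results travel back to be combined.

```python
# Hypothetical sketch of moving computation to the data: ship a small
# function to each cloud/datacenter, run it against the local dataset,
# and move only the compact results, never the raw data.

class CloudSite:
    """Stands in for one public cloud or private datacenter holding data."""
    def __init__(self, name, data):
        self.name = name
        self.data = data

    def run(self, fn):
        # The function travels to the data; only its small result travels back.
        return fn(self.data)

sites = [
    CloudSite("public-cloud-a", [4, 8, 15]),
    CloudSite("private-dc-b", [16, 23, 42]),
]

# Each site computes a partial sum locally; we combine partials globally.
partials = [site.run(sum) for site in sites]
total = sum(partials)
print(partials, total)  # [27, 81] 108
```

The bandwidth saving is the point: two integers cross the network instead of six data items, and the gap widens as datasets grow.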

Cloud Computing for Everyone

In computing, big revolutions happen whenever a new technology comes along that enables everyone to do something that could previously be done only by a small number of technology experts, or only by those with tons of money and technical talent. Personal computing (Microsoft, Apple), publishing (Adobe), search (Google), video (YouTube), and news/journalism (blogs) are all examples of this kind of disruptive, revolutionary change. What's the next big game changer? In a word: apps! We are about to move to an era in which everyone will be able to build their o... (more)

The Economics of Big Data: Why Faster Software is Cheaper

In big data computing, and more generally in all commercial highly parallel software systems, speed matters more than just about anything else. The reason is straightforward, and has been known for decades. Put very simply, when it comes to massively parallel software of the kind needed to handle big data, fast is both better AND cheaper. Faster means lower latency AND lower cost. At first this may seem counterintuitive. A high-end sports car will be much faster than a standard family sedan, but the family sedan may be much cheaper. Cheaper to buy, and cheaper to run. But massively ... (more)
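The economics behind "faster is cheaper" comes down to simple arithmetic on rented capacity: cost = nodes × hours × price per node-hour. The prices and times below are made up for illustration, but the structure of the calculation is the point: software that finishes the same job in half the time on the same cluster halves both the latency and the bill.

```python
# Back-of-the-envelope sketch: on pay-per-use infrastructure,
# cost = nodes * hours * price per node-hour, so a 2x speedup
# on the same cluster cuts the cost of the job in half.
# All figures are illustrative, not real benchmarks.

def job_cost(nodes, hours, price_per_node_hour):
    return nodes * hours * price_per_node_hour

slow = job_cost(nodes=100, hours=4.0, price_per_node_hour=0.50)
fast = job_cost(nodes=100, hours=2.0, price_per_node_hour=0.50)
print(slow, fast)  # 200.0 100.0 -- same job, half the latency, half the cost
```

This is why the sports-car analogy breaks down: with cars, speed is a premium you pay for; with rented parallel compute, speed is a discount, because you stop paying the moment the job finishes.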