BaseN

14.02.2011 - The Immaterial Phone

On Friday Nokia announced that they'll introduce a third operating system - Windows Phone 7 - to some of their high-end devices. Compared to the serious competition, which runs either Google Android or Apple iOS, I find this move puzzling. How does one keep up with the evolution of new features while supporting three different operating systems?

I've been a Nokia phone user since my Mobira Talkman in the late 1980s, mainly because Nokia has kept providing me with increasing capabilities at a steady pace. With today's N900 I can use every IT system in our company over an encrypted VPN. In addition, the underlying Linux allows full-scale administrative access to our BaseN Platform - a capability not often used, but it is there.

However, I've always found changing to a new phone troublesome and time-consuming. Calendar, address book, applications and other settings need to be synchronized, and this rarely succeeds well with the provided migration applications.

By introducing an immaterial phone, Nokia could, with its strong roots in telecommunications, become a game changer once again. Imagine if your phone's software actually lived primarily in the computing cloud, decoupled from the hardware. You could then invoke it with a web browser, a tablet computer or a 'surrogate' phone whenever necessary, your settings always in sync. Upon hardware failure or loss, you would just tell the cloud service the serial number of your new hardware, and in a few moments your environment would return. Or you could have multiple synchronized phones.
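
To make the idea concrete - purely as my own sketch, not anything Nokia has announced - the cloud-side state could be a user profile decoupled from any particular handset, with hardware bound to it only by serial number:

```python
# All state lives in the cloud profile; a handset is just a binding to it.
profiles = {
    "pasi": {
        "contacts": ["Alice", "Bob"],
        "calendar": ["2011-02-15 09:00 board meeting"],
        "apps": ["maps", "ssh", "vpn"],
        "devices": ["serial-ABC123"],      # currently bound hardware
    }
}

def bind_device(user, serial):
    """Replace lost or failed hardware: the environment follows the profile, not the phone."""
    profiles[user]["devices"].append(serial)
    return profiles[user]                  # everything the new handset needs to restore itself

restored = bind_device("pasi", "serial-XYZ789")
print(restored["apps"])                    # the same environment, now on new hardware
```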

This would also allow for novel ways to interact with your operator - wherever there is IP coverage (a WLAN hotspot, for instance), all GSM/CDMA traffic could be transported over it, altering the international roaming cartel quite radically. The protocols and technologies for this are largely ready, as we can see from the growing deployments of femtocell base stations connected to consumer broadband.

The existing Ovi portfolio, like similar services at Google and Apple, already offers data storage and backup, but its success has been limited because these have been marketed only as add-on features. Changing the game would require a whole-hearted effort to introduce a truly new phone concept. Someone will do it sooner or later, and I'm hoping it'll be Nokia.

//Pasi

06.08.2013 - Technological Wasteland

Intelligence gathering and intelligence services have existed since the earliest civilizations. Until now, however, they have been relatively expensive, requiring a lot of manpower and infrastructure.

Today's Internet, in turn, provides a very affordable, virtually free data warehouse for intelligence operatives. The recently revealed PRISM and XKeyScore probably carry a combined price tag smaller than that of a single foreign-country unit in the 1980s. Echelon, which caused some stir in the 1990s, was still a formidable investment.

Although I feel strongly about small government, citizen privacy and free speech, these are in my opinion not the biggest issues when it comes to this massive surveillance of foreign countries, their companies and their people - what concerns me is innovation on a global scale.

If all (national) research and development data is immediately available to a foreign intelligence agency and its contractor companies, it is just a matter of time before groundbreaking innovations and discoveries start to happen in the country that has its own R&D plus everything from, say, a friendly European Union. And I'm not talking about state-industrial espionage, which is illegal under international agreements. I'm talking about young people becoming subject matter experts in those vast intelligence organizations and then taking the next steps in their careers as engineers and inventors at existing and future companies.

My postulate is that a country, or a continent, which becomes a technological wasteland steadily degrades into a problem and conflict area. When millions of young people say that their dream is to work for the government because nothing else pays off (as happened in Egypt), a conflict is at the door.

In the US, the National Security Agency (NSA) also has the responsibility of supporting US companies in keeping their confidential data safe. It has introduced, for example, the Security-Enhanced Linux modules, the SHA-2 family of hash algorithms and a lot of training for technology companies, both private and state-owned.

Here in the EU, we urgently need to educate our research institutes and companies to excel in strong data encryption and related security technologies. Keeping data private should be a citizen skill, fostered by the government, if and when we are to maintain our innovation capabilities.

//Pasi - PGP Public Key 551D0D20

11.06.2013 - Consumption.. Now

People in intelligence branches tend to say that the value of reconnaissance data decreases exponentially with its age. Although historical analysis is important, having real-time data available is critical for better decision making. The human brain's short-term memory, after all, drives our primary cognitive functions.

We experience the same inflation when our utilities present us with energy and water consumption data delayed by 24 to 48 hours, or sometimes up to two months. This data is no longer actionable and usually just makes us feel guilty - for a while, as short-term memory has an efficient garbage collection system.

Last week we discussed our services with one of the largest local utilities, and how they relate to and integrate with its existing Automatic Meter Reading (AMR) and Meter Data Management (MDM) systems. When we inquired about their near-future requirements and wishes, the initial answer was somewhat startling:

"We'd like to know how much electricity our customers.. are using.. Now."

Well, that's something we deliver by default. The current AMR/MDM structures are too often bound to legacy billing cycles of one to two months. Smart meters can and will do much better, if the infrastructure is designed to be real time from the beginning.
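
As a minimal sketch of what 'Now' could mean on the wire - the endpoint, payload fields and interval here are my own assumptions, not an actual AMR/MDM or BaseN specification:

```python
import json, time, urllib.request

INGEST_URL = "https://ingest.example.com/meter"  # hypothetical endpoint, not a real API

def read_active_power_kw():
    """Placeholder for reading instantaneous power from the meter hardware."""
    return 1.42

while True:
    reading = {
        "meter_id": "FI-0012345",          # assumed identifier format
        "timestamp": int(time.time()),     # seconds since epoch
        "active_power_kw": read_active_power_kw(),
    }
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)            # push immediately instead of waiting for a billing cycle
    time.sleep(10)                         # every ten seconds, not every one to two months
```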

//Pasi

11.07.2013 - Root Cause Determinism

Most modern software applications, internal or external to an organization, tend to become highly complex over time in terms of physical and logical servers, databases, front ends and other components. To troubleshoot these, many companies offer shrink-wrap products that promise to find the 'root cause' of any current or even previously unseen performance and reliability issue.

These products work well for, eh, shrink-wrap applications which also have deterministic, shrink-wrap issues pre-built into the monitoring product's problem database.

Our experience is that even a basic CRM application is usually customized, or built into a slightly different networking environment, so that no canned product can find the real performance bottlenecks.

We've always taken a different approach. In cases where our Platform is used primarily for performance tracking, we enable as many data feeds as possible from all software and hardware components in and near the target system. This easily generates a data flow of a few hundred kilobits per second, which is then templated and visualized in real time. All visualization components are modular, so the performance view can be adapted very quickly to match the application structure.
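
As a rough illustration of the approach (my own simplification, not the actual Platform implementation), each feed can be tagged with a template describing how it should be visualized, so a view can be assembled per component:

```python
import time, random

# Hypothetical collectors; in practice these would query the real components.
def db_query_latency_ms():
    return random.uniform(2, 20)

def frontend_requests_per_s():
    return random.uniform(50, 200)

# Each feed is tagged with a template describing how it should be visualized.
FEEDS = [
    {"source": "db01",  "metric": "query_latency_ms", "template": "latency_panel",    "read": db_query_latency_ms},
    {"source": "web01", "metric": "requests_per_s",   "template": "throughput_panel", "read": frontend_requests_per_s},
]

def collect_once():
    now = time.time()
    return [
        {"ts": now, "source": f["source"], "metric": f["metric"],
         "template": f["template"], "value": f["read"]()}
        for f in FEEDS
    ]

if __name__ == "__main__":
    for sample in collect_once():
        print(sample)   # in a real system these samples would stream to the visualization layer
```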

According to Donald Rumsfeld (who has clearly read his Clausewitz), there are known knowns, known unknowns and unknown unknowns. As application features and complexity increase, it becomes more and more critical to scalably measure and infer the latter two.

//Pasi

12.03.2013 - Pseudoservices

My utility encountered a severe winter storm a few months ago, causing power outages for thousands of homes. Most mobile networks, though, were still up and running, as many base stations have battery and generator backups. Many people have smartphones, so they pointed their browsers at the utility's website and tried to find information about the outage and possible repair time estimates. The utility, like most others, has a graphical outage map, which had been paraded in the media just a couple of months earlier. So everything was supposed to be in order.

However, their outage visualization system collapsed after a few hundred simultaneous requests, which subsequently rendered their whole website unavailable.

My first thought was that this is just a sizing and configuration problem. However, when I now look at similar portals - the energy consumption portals of utilities, for example - a different thought arises. I think these portals have been badly designed on purpose. Giving customers real-time information could generate inconvenient questions about the goals, preparedness and real technological level of the utility, so an 'unfortunate' website overload is a good firewall against criticism, at least for now.
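
On the sizing point: an outage map does not have to be rendered per request. A minimal sketch of the idea, with invented names and intervals, is to pre-render the outage data on a fixed schedule and serve every visitor the same cached copy:

```python
import json, time, threading
from http.server import BaseHTTPRequestHandler, HTTPServer

_cached = b"{}"          # latest pre-rendered outage data, shared by all requests

def fetch_outage_data():
    """Placeholder for querying the real outage database."""
    return {"outages": [], "generated_at": time.time()}

def refresher():
    global _cached
    while True:
        _cached = json.dumps(fetch_outage_data()).encode("utf-8")
        time.sleep(30)   # regenerate twice a minute; visitor requests never hit the backend directly

class OutageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(_cached)

if __name__ == "__main__":
    threading.Thread(target=refresher, daemon=True).start()
    HTTPServer(("", 8080), OutageHandler).serve_forever()
```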

This is one of the few places where regulation would help. At these energy prices, customers should get accurate, real-time information about their service level and consumption, through systems that can cope with the usage patterns people are now accustomed to. Unsurprisingly, one of the best-performing public services is the tax authority, which now sports fully electronic interfaces towards most companies and people. The scalability and user-friendliness are there. Yes, when money is being collected.

It is now time to get other essential services to the same level. If the tax authority can process five million records in real time without a glitch, utilities can provide visibility into their services in seconds, without 24-hour or even one-hour delays. An information blackout is not the solution.

//Pasi

15.01.2013 - Big Data Cloud Pioneers (10 + 11)

When NASA launched the Pioneer 10 and 11 outer space probes in 1972 and 1973 respectively, local computing and storage were extremely expensive compared to today's resources. That's why it was logical to make them both fully cloud-controlled, using NASA's Deep Space Network. Their software was updated countless times before 2003, when Pioneer 10 finally fell silent near the outskirts of our solar system, due to the power constraints of its plutonium-based radioisotope thermoelectric generators. This was some 20 years after the probes' planned lifetime.

The telemetry, radiation and numerous other sensor data amounted to a total of about 40 gigabytes for both Pioneers - a formidable amount to store on the 800 cpi tapes of the late 1970s, or even on the 6250 cpi ones of the early 1990s. A full-size 800 cpi tape reel holds a maximum of about 5 megabytes.
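
A quick back-of-the-envelope calculation based on those figures shows why this was formidable:

```python
total_bytes = 40 * 1000**3        # ~40 GB of raw Pioneer data (decimal units assumed)
reel_bytes  = 5 * 1000**2         # ~5 MB per full-size 800 cpi reel

print(total_bytes // reel_bytes)  # -> 8000 reels needed to hold it all
```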

NASA had no obligation to store 'secondary' data like telemetry, but fortunately one of the original systems engineers, Larry Kellogg, kept converting the tapes to new formats every now and then. Thanks to him, scientists are still making new discoveries based on the raw Pioneer data. Having it in raw format is also exceptionally valuable, as more and more advanced algorithms can be applied to it.

Today's embedded but cloud-connected environments have a lot to learn from the Pioneers' engineering excellence and endurance planning. We just briefly forgot it when it seemed so easy to solve all storage and computing problems with local, power-hungry disks and CPUs.

Pioneer H, a never-launched sister of 10 and 11. Courtesy of NASA

//Pasi

16.10.2013 - Primordial Sea of Data

Hype has it that most organizations must start collecting and storing vast amounts of data from their existing and new systems, in order to stay competitive. In the past this was plainly called data warehousing, but now all major consultants and analysts rave about Big Data.

So what has changed? Apart from IT marketing, not much. Existing databases have been beefed up with enormous hardware and some fresh innovations (like Apache Hadoop) have arrived to try to overcome the limitations of legacy SQL behemoths.

But a database is just a database. Without algorithms and filters analyzing the data, it is just like the primordial sea before any life forms appeared.

In the coming years, nearly everything around us will be generating Big Data, so collecting it into any single location or database will be increasingly hard and eventually impossible. Databases - or the sea of data around us - will be highly distributed and mobile.

When reality soon eclipses the hype around Big Data, it will be those algorithms and filters, and their evolving combinations, that carry the largest value in any organization.
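
To make 'algorithms and filters' concrete, here is a minimal, generic sketch (my own illustration, not a Platform feature) of composing small filters over a stream of measurements, so that the value comes from the combination rather than from the stored data itself:

```python
# Each filter is a small generator that transforms a stream of readings.
def dedupe(stream):
    last = object()
    for x in stream:
        if x != last:
            yield x
        last = x

def above(threshold):
    def _filter(stream):
        for x in stream:
            if x > threshold:
                yield x
    return _filter

def compose(*filters):
    def _pipeline(stream):
        for f in filters:
            stream = f(stream)
        return stream
    return _pipeline

readings = [20.1, 20.1, 23.4, 25.0, 25.0, 19.8, 26.2]
pipeline = compose(dedupe, above(22.0))
print(list(pipeline(readings)))   # -> [23.4, 25.0, 26.2]
```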

Those entities, which we call Spimes, need a scalable, fault-tolerant and highly distributed home. It is also known as the BaseN Platform.

Rising data sea levels

//Pasi

16.12.2013 - Platform Socks

Having EU size 48 (US 13) feet causes some inconvenience, as the selection of shoes and especially socks is quite small. Usually the largest sock size is 46-48, meaning that they barely fit and thus wear out relatively fast, probably because the fabric is stretched to its maximum.

Socks have had a very similar user interface and sizing for centuries. There are a few providers of custom-made socks, but they charge steep prices and still start from the assumption that all feet have similar dimensions.

Looking at this from our beloved Spime (and my own) perspective, socks should quickly be spimified so that each new pair would fit slightly better. They should also be reinforced at the spots where the previous pair experienced the most wear.

In order to sense wear and stretch, a sock might have a simple RFID tag connecting tiny conductive threads woven throughout its fabric. Each time the tag was read, it would transmit a 3D map of the sock, based on the matrix of broken and still-conducting threads, to a reader device - for example a mobile phone - capable of sending the data onwards.
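
As a toy illustration of that idea - the thread layout, zone names and thresholds are all invented here - the reader side could reduce the matrix of broken and intact threads into a per-zone wear map:

```python
# 1 = thread still conducting, 0 = thread broken (i.e. worn through)
thread_matrix = {
    "heel": [1, 0, 0, 1, 0, 1],
    "toe":  [1, 1, 0, 1, 1, 1],
    "arch": [1, 1, 1, 1, 1, 1],
}

def wear_map(matrix):
    """Fraction of broken threads per zone - the sock spime's wear state."""
    return {zone: 1 - sum(threads) / len(threads) for zone, threads in matrix.items()}

def reinforcement_plan(matrix, threshold=0.3):
    """Zones the next pair should reinforce, given a wear threshold."""
    return [zone for zone, wear in wear_map(matrix).items() if wear >= threshold]

print(wear_map(thread_matrix))            # {'heel': 0.5, 'toe': 0.166..., 'arch': 0.0}
print(reinforcement_plan(thread_matrix))  # ['heel']
```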

As most RFID tags are now mass-printed, the additional cost of the sock tag would hardly exceed 15 eurocents, so this would be feasible for most sock manufacturers.

Now add the BaseN Platform to host those millions of sock spimes (3D maps, current and historical) and my desired sock service is ready. I would subscribe right away if I could get new, better-fitting socks mailed to me just before the old ones start breaking up.

//Pasi

16.04.2013 - Smart Slum

In most energy efficiency and Smart Grid projects and pilots, we see shiny new buildings constructed for wealthy inhabitants, who drive and charge their hybrid SUVs and golf carts using state-of-the-art solar panels on the roof of the building.

This is all nice and convenient, but it does not stand up to closer scrutiny when it comes to the total energy efficiency and carbon footprint of these measures. The small-scale solar panels, wind turbines and more complicated construction methods can easily generate more carbon dioxide during their manufacturing than the building saves over its typical lifetime.

If we're really going to increase societal energy efficiency, technologies must be designed to scale to millions of people, starting from the lowest income classes. The smallest student and single-parent rental flats should be at the top of the list to reap the benefits of smart meters, demand response, time-of-use energy pricing and other services offered by quasi-monopoly utilities.

This is more than doable, provided that the state and municipal owners of these utilities take some action instead of just collecting nice dividends each year.

Smart and glossy is good but not enough (Image by Skanska)

//Pasi

19.11.2013 - Missing My Code

At 12 I built a rudimentary, 4-relay controller which I attached to my Commodore 64's user port. This allowed me to create irritating light shows with colored bulbs on the relays, in addition to controlling volume and channel selections on a half-dismantled stereo set. Triggering these at times when I was away from home was highly enjoyable - from my point of view, at least.

I was always fascinated by radio-controlled (RC) aeroplanes and helicopters, but due to their high prices I was only able to negotiate a couple of simple toy-grade RC cars. Controlling a car wirelessly with a dedicated controller was a fun thought, but it turned out to be boring within a few days.

When the other RC car's motor let out the holy smoke (as I had installed an additional, way too powerful battery pack) I was left with one working car and a bunch of spare parts, including an additional radio control set.

Having seen Star Wars, I wanted a thing that could control itself (yes, I really like(d) R2D2), so I attached the radio set to the C64's relay controller. Four relays were just enough for forward/back and left/right commands. This initially enabled me to drive the car from the C64 keyboard and to save and replay its routes. Cool, but not enough.

Installing the second radio transmitter in the car enabled me to add front and rear collision sensors made from bent copper. The receiver was connected to the C64's joystick port, as that was easy and fast to read in software.

The end result was a car capable of mapping a room and avoiding obstacles by itself. I coded for weeks to make the thing as autonomous as possible within the constraints of 64 kilobytes of memory. Looking at the car, I felt it was 'thinking', as the 1 MHz processor took quite some time to iterate over the coordinates in memory.
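
A modern reconstruction of that kind of logic might look like the sketch below - a minimal Python toy, certainly not the original BASIC or assembly:

```python
# A toy reconstruction: map a room as an occupancy grid and wander while
# avoiding previously seen obstacles. Grid size and moves are arbitrary.
UNKNOWN, FREE, OBSTACLE = 0, 1, 2
grid = [[UNKNOWN] * 20 for _ in range(20)]     # the 'room' in memory
x, y, heading = 10, 10, (0, 1)                 # start in the middle, facing 'north'

TURN_LEFT = {(0, 1): (-1, 0), (-1, 0): (0, -1), (0, -1): (1, 0), (1, 0): (0, 1)}

def bumper_hit(nx, ny):
    """Placeholder for the copper collision sensors; here, the room walls are the obstacles."""
    return not (0 <= nx < 20 and 0 <= ny < 20)

def step():
    global x, y, heading
    nx, ny = x + heading[0], y + heading[1]
    if bumper_hit(nx, ny) or grid[ny][nx] == OBSTACLE:
        if not bumper_hit(nx, ny):
            grid[ny][nx] = OBSTACLE            # remember where we bumped
        heading = TURN_LEFT[heading]           # turn instead of driving into it
    else:
        x, y = nx, ny
        grid[y][x] = FREE                      # mark visited cells as free

for _ in range(200):                           # let the 'car' explore for a while
    step()

print(sum(row.count(FREE) for row in grid), "cells mapped as free")
```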

It was the coolest thing I had built thus far. I do still have a few cassettes and 160-kilobyte floppy disks, but I doubt they are readable any longer, so the software is probably lost forever. Now, 30 years later, I'd like to understand my thinking back then.

That software, or the essence of its algorithms, could now run on the BaseN Platform, with access to terabytes of memory and thousands of processors. It would be the car's spime. And I would make it way cooler than R2D2 ever was.

//Pasi

21.02.2013 - Measurable Empathy

When BaseN hires people for any position, half of the interview is always dedicated to an ad-hoc role play in which the applicant is presented with a scenario built from components of past BaseN endeavors. Every interview includes two BaseN people, or more depending on the scenario.

Compared to the traditional interviews we used to conduct a few years ago, we've concluded that the scenario method yields far more information about the applicants. Most surprising are the otherwise promising candidates who simply decline to participate, citing that they would have needed time to prepare. But.. in most of our positions, tasks must be faced without a period of preparation, using the skills at hand. Such an interview ends there.

So what does our scenario model actually measure? After a few dozen interviews and a lot of thinking, I believe the answer is empathy. People who excel in these scenarios can be outgoing or shy, independent or collegial - very different people indeed.

My conclusion is that in our kind of dynamic workplace, empathy - and with it the ability to play out mental scenarios without one's own prejudices - is by far the most important skill people must possess. It enables people to develop continuously and to adapt quickly to new situations, while maintaining a curious mind.

In other words, dreaming is allowed and encouraged - provided that it involves the BaseN Platform during working hours.

Because we're not robots at BaseN: empathy matters!

//Pasi

24.09.2013 - Welcome to the Spime Farm

During the last couple of years, the BaseN Platform has been used to monitor and control an increasing number of nodes, or devices, outside the traditional telecom and IT realm. As a consequence, we've vastly developed our capabilities for securely hosting complex algorithms which analyze - and, perhaps even more importantly, control - things ranging from solar inverters to rat traps.

Fast-forward a few of our customer's product development cycles. What do we actually host? The rat trap may send us images, temperature, humidity and olfactory sensor data, while we (the Platform) issue the lethal blow if, and only if, exactly the right species of rat (and definitely not the rare red-crested tree rat) enters the trap.
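
A caricature of that control rule - with invented species labels and confidence thresholds - could be as simple as this on the Platform side:

```python
PROTECTED = {"red-crested tree rat"}        # never trigger on these
TARGET = {"brown rat", "black rat"}         # the species the trap is meant for

def should_trigger(classification, confidence, min_confidence=0.95):
    """Decide whether the trap's spring may fire, given the classifier output."""
    if classification in PROTECTED:
        return False                        # protected species: never, regardless of confidence
    return classification in TARGET and confidence >= min_confidence

print(should_trigger("brown rat", 0.98))             # True
print(should_trigger("red-crested tree rat", 0.99))  # False
print(should_trigger("brown rat", 0.60))             # False - not confident enough
```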

The trap itself is, in fact, an algorithm within our Platform, with (somewhat cruel) physical extensions in the form of a killing spring and an array of sensors. This kind of physically augmented virtual entity is called a Spime, a term coined by author Bruce Sterling in 2004. The point is to emphasize the model over the manifestation.

Spimes, which record and manage the full lifecycle of their physical representations, enable immense efficiency improvements when combined with recycling and 3D printing technologies.

We believe that the roles of hardware and software will intermingle toward the spime ideal, and that there will be a need to manage untold numbers of variegated spimes-in-the-wild. This is the direction we're heading in, empowering our customers to devise their own evolving algorithms for their own spimes. The first step on this road is my.basen, our first service that can be fully activated and operated via the web. Customers register, log in, start sending data, and create algorithms and actions. All as a service.

//Pasi