
Wednesday 5 December 2012

In The News 1


As a technology reporter, you get to be at the forefront of the tech industry. In my case, I started with IT and then moved on to serious R&D science reporting.

Reporting on IT during the Dot-com Era was exhilarating: there would be something new to write about every week. The press releases we received piled up as thick as a loaf of bread!

Back then, the IT industry as a whole was nascent worldwide: hardware, software, applications and so on were all new, mostly driven by new PC/server releases and internetworking products. There was LAN, then WAN. Conference meetings on the PC platform went from audio to video. Video feeds evolved from analog to digital. The whole PC platform got hijacked into a mobile phone, and thus were born the smartphone and the 3G network.

In a nutshell, that was what happened. But of course, the Dot-com Era was more than that. It was a time of intense competition, because everything was new: products as well as network speeds.

Ethernet crawled from 10Mbps to 100Mbps to 1Gbps, beating Asynchronous Transfer Mode (ATM) to the desktop. It was cheap and legacy-friendly. But ATM's 53-byte cell format was later preferred for serial streaming over fiber-optic WANs. Remember all the many ATM Forum meetings and conferences we used to have? The wonderful thing about a cell-based data stream is that cells can weave their way through various router paths and yet arrive at the same destination. Mobile comms data follow the same schema. But the 'cell' in mobile telecoms also refers to how an area is divided into a honeycomb-like mesh of transceiving areas, so don't confuse the two!
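For the curious, here is a minimal sketch in C++ of what one of those cells looks like. The 53-byte size (a 5-byte header plus a 48-byte payload) and the header fields are the standard UNI cell format; the unpacked struct layout and the example VPI/VCI values are mine, for illustration only:

```cpp
#include <cstdint>

// Illustrative layout of a single ATM cell (UNI format): a 5-byte header
// followed by a 48-byte payload, 53 bytes in total. On the wire the header
// fields are bit-packed into exactly 5 bytes; they are unpacked here for
// readability.
struct AtmCell {
    uint8_t  gfc;          // Generic Flow Control, 4 bits on the wire
    uint8_t  vpi;          // Virtual Path Identifier, 8 bits (UNI)
    uint16_t vci;          // Virtual Channel Identifier, 16 bits
    uint8_t  pt;           // Payload Type, 3 bits
    uint8_t  clp;          // Cell Loss Priority, 1 bit
    uint8_t  hec;          // Header Error Control checksum, 8 bits
    uint8_t  payload[48];  // fixed 48-byte payload
};

int main() {
    AtmCell cell{};
    cell.vpi = 1;   // hypothetical virtual path
    cell.vci = 42;  // hypothetical virtual channel
    return 0;
}
```

Because every cell is the same small, fixed size, switches could forward them in hardware at line rate, and it is the virtual path/channel identifiers in the header that let cells take different routes yet be reassembled in order at the destination.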

When I first started reporting IT, a PC running an 80486 DX4 processor was considered high-end. The toaster-oven-shaped Macintosh was competing for desktop space in homes and editorial offices, not so much in business. Remember Apple's Lisa business computer? It failed miserably.

It's no wonder, because Steve Jobs never felt comfortable in a business suit. He and his products were just not that sort. He much preferred the "smart casual" look!

Having both Apple and Windows products in the office in that era was a pain in the ass. Each required a separate network to link up; interoperability was still the Holy Grail. In the 80s and early 90s, there was the Apple vs PC debate, but it applied only to the personal computer arena.

Besides each sporting a different OS and physical look, there was also a hardware battle inside. At the time, Motorola (the semiconductor giant) was aligned with Apple; Intel sided with Bill Gates and Windows. It was the 68000 architecture vs the 80x86. As trainee engineers in the 80s, we learned to work with both - the schools did not want to take sides. We also learnt something better: microcontrollers. These were touted as the processors of the future for control systems, stuff that controlled everything from intelligent buildings to washing machines. Back then, each microcontroller needed supporting chips such as a digital-to-analog converter, I/O ports, timers, EEPROMs, etc. These days, you'll find everything in one solid integrated package.

After the 486 came the Pentiums, from the original Pentium through the Pentium II, III and 4. In terms of Windows, it went from version 3.0 to 95 to XP in 2001. That year, I was invited to Microsoft's Redmond headquarters to witness the launch. I was, as mentioned in a previous blog post, part of a press junket of prominent journalists from Asia. I was executive editor of two prominent PC magazines then, one dealing with Windows user problems... so naturally, I was on the PR media 'Priority List'.

I was excited, of course, to go to Microsoft's campus near Seattle. Anybody would be, even if it felt like entering 'Borg' territory. You know the joke: "Resistance is futile; you will be assimilated." Yes, Microsoft was huge and intimidating back then. Given its rivalry with Apple, it had that "it's either me or them" mentality.

Apple in 2001 was still largely a niche player. Their fast 'G' series of Power Mac computers (with their good displays and slightly more powerful processors) were mainly used by advertising and graphic design agencies. But by then the PC makers were already introducing new architectures, speedy motherboards (especially with faster front-side buses) and improved graphics cards. It was around that time that graphics cards started sporting their own GPUs, or graphics processing units, with their own memory banks. This offloaded a lot of work from the main CPU, freeing it for more important computational tasks.

The publishing company I worked for used Windows PCs instead of Macs to design and lay out its magazines. It was rather unusual, but they weren't disadvantaged at all. Another much larger publishing house was doing the same.

In fact, the advantage of using a PC has always been that it is a commodity product: an off-the-shelf purchase that can be bought anywhere, with components mixed and matched. Unlike Apple, which was proprietary. That's one thing I couldn't stand about them. Buying an Apple printer meant connecting only to an Apple machine. What a waste!

As practising engineers before that, we seldom thought of using an Apple machine. With a DOS-based PC, it was so easy to design an interface system or write a driver. You just had to know C++. And you could literally 'plug and play' - which is why so many manufacturers of oscilloscopes, signal generators and digital analysers were on the PC-Windows bandwagon. The learning curve to deploy them was so much shorter. It was economical too. Imagine having to buy all your PC paraphernalia from just one company, Apple. They wouldn't have been able to cope with all that hardware and driver demand, let alone the upgrades and partner issues.
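To give a flavour of how simple DOS-era interfacing was, here is a hedged sketch in old Borland-style C++. The `inportb`/`outportb` routines are the real Turbo C/C++ port I/O calls from `<dos.h>`, but the card's base address, register offsets and bit masks below are all invented for illustration:

```cpp
#include <dos.h>  // Borland Turbo C++: inportb()/outportb()

const unsigned int CARD_BASE = 0x300;  // hypothetical I/O address of an add-in card

// Tell the card to start a measurement, then poll its status register
// until the (made-up) 'ready' bit comes on.
void trigger_and_wait() {
    outportb(CARD_BASE, 0x01);                   // write the command register
    while ((inportb(CARD_BASE + 1) & 0x80) == 0) {
        ;                                        // spin until ready
    }
}

int main() {
    trigger_and_wait();
    unsigned char reading = inportb(CARD_BASE + 2);  // read the result register
    return reading;  // DOS-style: the exit code doubles as the measurement
}
```

That was the whole "driver": no kernel mode, no signing, no abstraction layers, which is exactly why the test-equipment makers loved the platform.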

To this day, the Windows PC remains the choice of engineers everywhere. Some even run Ubuntu Linux, what with the programming community being so strong on that platform. The introduction of Android has grown that community's expertise tremendously.

And to this day, Apple's PCs remain undeployable for engineers as interface machines. (1) They look too pretty. (2) They are not intended for that use. Unless Apple suddenly comes out with something like the Raspberry Pi (a bare-bones computing unit for rural projects and hobbyists), I don't see the situation changing. Or unless they create something more professional. It would never have happened with Jobs... Wozniak, perhaps.

But what Apple did right was to come up with products that were easy and friendly to use, such as the iPhone and iPad. These portable machines have encouraged engineers and hackers to crack into their OS shells to make them do things the OS never intended. In Android, this is called "rooting". In iOS (Apple's OS), the same is called "jailbreaking". Rooting can allow your Android smartphone to be used as a Wi-Fi tether. Jailbreaking lets you manipulate an iPhone's icons and menu display... among other things. All you have to do is look around and fiddle. Or join a tech forum for tips.

You might wonder why the popularity of the iPad and Apple's laptops has not translated into a bigger market share for the company. It's because the Windows PC world is much bigger than what we see on our desks and laps. Even discounting the corporate market, Windows PCs are used in many other areas. Specifically, a huge market exists for the "industrial PC".

Not sure what that is? Well, it is basically a PC made to fit an industrial use. As such, these machines come in all shapes and sizes. In technical-speak, that's "form factor".

Say you need a slim PC to fit into an equipment rack. No problem: someone can offer you one that is 2U in size ('U' being a rack unit of 1.75 inches, so a 2U machine takes up a 3.5-inch slot). How about one that can be used in rugged conditions such as rain and snow? No worries; companies like Grid Computers have been making laptops that folks could bring to the desert or to war. Such laptops can be dropped from a metre's height and still function. Or be slid across the floor and stamped on. The present-day Panasonic Toughbook is an advanced example. You have probably seen their funny (and impressive) ad featuring an elephant and a monkey.

A more common example of an industrial PC is the one found in a car park payment machine. That PC runs the currency reader, the LCD display and the ticket reader. It is also connected via LAN to a server and via RS-232 to the barrier gate control system at the entry and exit points. It is a busy little PC (no pun intended), often tasked to run under very hot conditions. If you think your PC is puffing hot air on your desk, imagine it cooped up in a walled box with few air slits. It is the reason your change from the fare machine always feels warm, ditto the $1 coin refunded from an SMRT ticket machine. At times, the returned coin can be too hot to handle!
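As a hedged illustration of that RS-232 link, here is a minimal Win32 C++ sketch that opens a serial port and sends a single command byte to a barrier controller. The COM port, baud rate and the 0xA1 'raise barrier' byte are all hypothetical; real barrier gates each speak their own vendor protocol:

```cpp
#include <windows.h>

// Minimal sketch: open a serial port and send one command byte to a
// barrier-gate controller. All protocol details here are invented.
int main() {
    HANDLE port = CreateFileA("COM1", GENERIC_READ | GENERIC_WRITE,
                              0, nullptr, OPEN_EXISTING, 0, nullptr);
    if (port == INVALID_HANDLE_VALUE) return 1;

    DCB dcb{};
    dcb.DCBlength = sizeof dcb;
    GetCommState(port, &dcb);
    dcb.BaudRate = CBR_9600;     // a typical speed for such equipment
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(port, &dcb);

    const unsigned char RAISE_BARRIER = 0xA1;  // hypothetical command byte
    DWORD written = 0;
    WriteFile(port, &RAISE_BARRIER, 1, &written, nullptr);

    CloseHandle(port);
    return (written == 1) ? 0 : 2;
}
```

Win32 treating a COM port like an ordinary file is precisely the kind of commodity convenience the industrial PC trades on.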

Other industrial scenarios where special form factor PCs are used include the inside of a petrochemical plant, an aircraft or even a battleship. Many point-of-sale (POS) cashier machines are now PCs running Windows. It's easier to hook them up via Wi-Fi to a backend server.

Why are there so many different PC form factors?

A PC is, after all, just a collection of chips, I/O ports and connected devices. It can take any shape and form; it doesn't have to go into a rectangular box (the desktop) or something that looks like a flat pizza box (the laptop). In the past, much was dictated by the inflexibility of printed circuit boards, or PCBs. But since then, PCBs have become multi-layered like kueh lapis, yet much thinner. Different PCB materials let you do that.

My favourite PC has to be the one that comes on a backplane: a slim one, no thicker than two inches, that slots into a rack (what you see in most spaceships in sci-fi movies, especially when the hero has to enable or disable something behind an instrument panel). There are those that run not on hard disks but on flash memory instead. The CompactFlash memory card format (remember that one?) was popular back then, as it was the only format capable of providing large and speedy storage. But I am sure they would have graduated to the smaller SD cards by now, if not micro-SDs.

Imagine a computer whose OS and start-up software all run from memory cards. There's no need for a hard drive, or even PCB-mounted memory chips. Industrial PC makers have been doing that for over ten years; tablet PC makers have only started doing it recently. To be fair, it all came down to cost. A consumer is only willing to pay so much for a retail product. With industrial PCs, user companies don't mind paying extra for custom features as long as the machine does its job well in the stipulated environmental conditions. In such cases, the higher-quality components do cost more to produce, test and qualify (to MIL-spec, for example).

Also, SSD (solid-state drive) memories have come a long way in terms of capacity and cost. It used to be a dollar a MB; now it is $2.50 a GB. That works out to roughly $1,000 per GB then versus $2.50 now, a 400-fold drop. How times have changed! I think of the 21MB Hitachi disk drive in my 1985 XT PC and laugh. But it was such a solid disk drive that I am inclined to put it on an altar to be worshipped. It withstood a major electrostatic shock and kept on working. Many would have expired at the first spark. (I was at the time building a PC interface card to trigger a weather warning system.)

What does having a drive-less PC mean? Well, you can store all your programs on a smartphone, connect it to such a PC (a drive-less backplane computer, for example) via a micro-B USB port, and run apps off it. It's a hacker's dream. A scenario like this was played out in Aliens (1986, the second Alien movie), when the android Bishop had to use a handheld computer to patch into the facility's network and call down a spacecraft to bring him, Ripley and the young girl home. A computer with a hard disk simply cannot be made as small as the trendsetting HP 95LX palmtop.

One of the more interesting PCs I've seen ran as an emulator machine. That's no big deal now, but 11 years ago it was. That particular emulator machine (from Celoxica) allowed software from old game machines like the Commodore 64 and Atari to run on it. But the PC was actually built to run complex 3-D biological simulations to aid drug discovery; emulating old games was just to demonstrate a point.

(And I once saw an old BBC Micro 8-bit computer being used to control a one-legged hopping robot in the early 00s. British engineers are nostalgic that way. It proved that you didn't need much hardware to accomplish something very complex.)

Did you know that there's a free Nintendo DSi emulator floating around on the Web? Download it and you can play DSi games on the PC. It does expand the gameability of a netbook (a dual-core one at least). And there are many free downloads of Nintendo's DS games online (one popular site in particular). I don't encourage piracy, but the prices Nintendo charges per game at the shops are just plain ridiculous. $45-$75 for a single game card? No wonder pirate game cards that use micro-SD memory cards to store games exist. With these 'R4' or 'R4i' cards, you can store multiple games, limited only by memory card capacity. A typical 4GB card can hold some 80-90 games depending on file size (that averages out to roughly 45-50MB a game). The solution is a godsend for parents wanting to save money on games. Let's face it: there are only a handful of engrossing titles out of a few thousand. Are you going to spend a fortune to find out which ones? Game reviews offer limited help.

But not all Nintendo games are for children. If you feel you are getting senile, I suggest playing one of Nintendo's many puzzle games for adults. Brain Age is one, Sudoku another.

Probably the most earth-shaking bit of news in the IT industry in the last decade (no, not Jobs dying) has to be Apple's hardware switch from the Motorola/PowerPC camp to Intel. It's like Richard Dawkins suddenly embracing religion; it was that weird. But I think the winners were Apple and the PC user. We PC users no longer have to worry about interoperability issues. Microsoft Office files can finally be exchanged easily. That's quite remarkable, isn't it? And with greater general acceptance, Apple went on to sell more home and office computers. Apple became the bling PC to have in the office. Quite a change in fortunes for that once-little company, no? I mean, who could have imagined the proverbial Montagues and Capulets getting along like that?

With speedy PC machines these days (dual, quad or dual-quad cores), software emulators perform just as well as hardware-based ones. They used to lag behind badly, especially when 3-D graphics were involved. All those vector computations can really slow an emulator program down.

Speaking of multi-core PCs, they really killed the market for parallel-processing machines. I remember a local company in Singapore that used to make them, run by a Taiwanese physics PhD. They built their own parallel-processing boards out of Intel processors, which sort of piggybacked on one another, and were intended for the server market where computing speed (in MIPS) mattered most.

Of course, nothing prepared me for the PCs in Japan. This was before multi-language capability (in an OS like XP) came to the desktop PC. In Japan then, they wrote their own OSes and all their PCs worked in Japanese. NEC was a big player there, much as IBM was in the US. They made mainframes, client stations and desktop PCs.

It took the Japanese many years to switch to Windows and become more compatible with the rest of the world. The multi-language functionality of XP really helped the World Wide Web become truly international. All of a sudden, we could all type in a foreign language without having to download a driver or buy some special keyboard.

With Google Translate now, we have certainly come a long way. Ten, twelve years? It's been a blink of an eye compared to the way other new things get introduced, pharmaceutical drugs for example. Their introduction can take decades.

But OS technologies have come a long way. Just look at Android: since 2010, four or five new versions have been introduced, and it's all open source! Expect more innovations in the next couple of years. (I once knew a guy who wrote an OS for a local shipboard communications system. That was in the early to mid 90s. Singapore does have talent in this software area.)

And which technology will next make a big impact?

Well, I cannot wait for the day when speech recognition in a machine becomes natural. Things will happen in a split second, as fast as you can speak - or as fast as the machine can infer what you actually mean. Hearing is not the key; inference is. And that has been the stumbling block for speech recognition. But hey, didn't they use to say the same about video? These days cameras can discern the 'intentions' of the people they film... to ascertain whether they are friendly or hostile.

It's important because flying drones on the battlefield need to know whether to shoot at you or simply ignore you. I've seen flying drones that are like small helicopters equipped with machine guns. Run like hell, or carry a 'friendly' mask to wear. One of Bill Clinton, maybe. Ha ha.

But really, cameras like these are already deployed in London to pick out rabble-rousers and hooligans in crowded places.

We are actually further into the future than most people imagine. And it is always the space or military folks who lead the way: folks with the money and wherewithal to explore and invent, usually without having to contend with commercial safety standards or debates over ethical use. A prime example is "directed energy weapons", or DEWs. They can microwave-fry you from a distance. Ah, technology. After so many years, it is still fun to report on. But I will stand far, far away, thank you.

Afternote: In 2002, I brought a start-up company to the CeBIT IT fair in Hannover, Germany. We went as part of an EDB contingent of promising companies. In this group was a company called Muvee. They claimed to have invented software that could create an edited wedding video from a long one, with musical accompaniment no less. They showed how it was done, and it was quite amazing. Imagine shooting a wedding and leaving a computer to do all the editing work!

The next story: Once Upon A Spider

[Photo: Inside a ticketing machine.]

[Photo: A program that reads faces and moods and creates an edited wedding video from a long one. With music! (A Singaporean invention/product)]
