Monday, May 28, 2007

So Long, Farewell, auf Wiedersehen, Goodnight

Well, it is time to close another chapter of my MBA course. When this all started 6 weeks ago, I was thinking, “what style and substance can I add to this?” I’m sure some folks found it a bit of a drag to write on topics related to IT, but I didn’t really have a problem coming up with topics. The very first blog was rather intuitive to write, considering I have worked in the IT industry for the last 10 years. However, I soon realized that there was a lot more to the IT industry than what I have experienced as a mainframe programmer and, in my latest career incarnation, as a business analyst.

The course touched on a lot of things I am familiar with (databases, transaction processing, enterprise systems, etc.) and some of the latest technologies I was not so familiar with (wi-fi, IEEE standards). But it also got me into blogging. I had always thought that this was for folks who didn’t have much to say, or didn’t bother writing “letters to the editor” to their local print media. But I have come to find out, as mentioned in one of my articles, that blogging is now part of the cultural landscape, more so than I expected.

On my first article on software flaws, I got a response from a gent whom I had never met, who was not part of my class, and who didn’t work in my profession. I was a little wigged out at first, but then it dawned on me that this is what it’s all about: voicing an opinion and sharing information, no matter how subjective the topic is. And after the dot-com bust, I thought it would be eons before a viable company would make it online again, but then Google proved me wrong. So thanks for introducing me to Web 2.0, and for bringing me up to speed technologically with most teenagers without having to watch MTV. But given the rate at which technology changes, I'm sure by the time I finish this program, I'll be assumed to be technologically illiterate, like the poor lad in the following clip.


Friday, May 25, 2007

Getting a Decent Return on Your IT Investment

So as a CIO, you’ve just spent a million dollars to implement a new communications system, only to find out that it will be outdated in 6 months. Or you want to pay $300 for Vista, only to learn that Microsoft’s next OS will be released in the near future. Because of the increasing power of computing (Moore’s Law, etc.), today’s Cadillac software is tomorrow’s Pinto. In fact, South Korean kids now consider email outdated, and are abandoning it in favour of instant messaging technologies. So the question must be asked: “Why update if the bare minimum standard of technology is a constantly moving target?”

The answer is simple: “to improve a facet of your business, whether it makes or saves money.” The million-dollar communication system may be outdated compared to the latest product on the market, but did it solve any of the business’s problems and improve the company’s bottom line? Was it better than the old system at creating cost savings or increasing productivity? If the answer to questions like these is yes, then the investment has generated a return and done its job. The technology need not be the fastest, most powerful, or best in its class in order to be implemented. If it keeps in line with the strategic goals of the company, any IT investment can be a good one. As long as it isn’t training seminars stored on laser discs, that is.
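The point can be made concrete with some back-of-the-envelope arithmetic. The figures below are purely illustrative (that hypothetical $1M system, with assumed annual savings), not from any real project:

```python
# Back-of-the-envelope return on an IT investment.
# All figures are hypothetical, for illustration only.

def payback_period(investment, annual_benefit):
    """Years needed for cumulative benefits to cover the investment."""
    return investment / annual_benefit

def simple_roi(investment, annual_benefit, years):
    """Total return over `years` as a fraction of the investment."""
    total_benefit = annual_benefit * years
    return (total_benefit - investment) / investment

cost = 1_000_000       # the $1M communications system
savings = 350_000      # assumed yearly cost savings plus productivity gains

print(f"Payback period: {payback_period(cost, savings):.1f} years")  # 2.9 years
print(f"5-year ROI: {simple_roi(cost, savings, 5):.0%}")             # 75%
```

If numbers like these hold, the system has done its job, even if a shinier product ships six months later.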

Tuesday, May 22, 2007

The Magic of IT Empowerment?

As mentioned in my previous article about IT empowerment, the worker is relieved that there is potential for less direct management. However, there can be a flipside to that: management is also relieved that there is potential for a decrease in payroll expenses.

Just as that Vancouver lawyer can communicate in real time with his Toronto counterpart, he can communicate with his even newer counterpart in Bangalore just as easily. Offshoring and outsourcing are by-products of improved information and communications technology. This impact was first felt in the call-centre industry, but now complex software systems once developed and maintained in the western world are being built in developing nations, primarily India. In fact, it is estimated that India is the number one destination for IT outsourcing. The country has coupled modern information and communications technology with a skilled, English-speaking workforce and a lower cost of living than western nations to grow its information technology sector. Other countries, such as Ireland, China, Russia, and in some cases Canada, are following suit.

So it appears that this technology can either empower a worker to do his job more effectively, or move that job to a low-cost centre altogether. Obviously, there have been many drawbacks to outsourcing on both sides of the ocean, too many to get into here.




The Magic of IT Empowerment

We have all read countless case studies in which the manager of some stuffy mom and pop machine shop buys his staff a high-speed internet connection, a wireless hub, and 10 laptops, and transforms it into some lean, mean globalization machine with an Assets Turnover Ratio launching right off the financial statements into the stratosphere. We get the point. IT infrastructure has the uncanny ability to enhance the productivity and efficiency of today’s worker. The degree to which productivity is enhanced depends on the job performed, of course (a spreadsheet in the possession of an accountant is much more effective than a spreadsheet in the possession of a bricklayer). However, it can be said that today’s employee has the ability to process information more rapidly, and to communicate and collaborate with a greater breadth of people and depth of information.

Consider the following scenarios: A lawyer in a Vancouver office works with a co-worker in Toronto on a fraud case using NetMeeting, Instant Messenger, and other instant communication tools; these very same tools are used by an introverted software programmer to communicate with a gregarious tester sitting in the adjacent cube, who normally resents the programmer’s lack of verbal communication.

In both scenarios, the following has occurred:

1. Barriers to communications, both physical and sociological, have come down.
2. It seems that distance is not much of a problem, as long as the infrastructure is in place to allow seamless communication. However, as in the first case, time difference can be.

Due to this, organizations can flatten out, cutting costs of bureaucratic channels of communication and oversight between groups. Finally, the worker can make informed decisions regarding colleagues, customers, supervisors, and any other stakeholders involved in their daily activities. Empowering workers can lead to business success.

If it’s Too Good to be True, Then There Probably Isn’t a Nigerian Official Being Detained

Cybercrime has proliferated with the Internet. One of the first waves of cybercrime originated in Nigeria, in the form of scam letters. There are many examples of this, but it has been so prevalent that these letters are often referred to as the 419 scam, after the section of the Nigerian Criminal Code that deals with fraud.

Over time, the scammers have become more sophisticated, with phishing scams and lottery scams becoming the ‘fraud of choice’ for those folks who operate less than honestly. In the wireless world, a major concern is that these fraudsters can spam you not only on your PC, but on handheld devices as well, so there is little reprieve from them. And fraud is just the tip of the iceberg; other crimes include identity theft, sexual exploitation, and blackmail.

To combat this, police forces worldwide have created cybercrime task forces. However, these criminals operate across legal jurisdictions, which makes law enforcement that much trickier.

Delegates from the Asia-Pacific Economic Cooperation (APEC) group have recognized this problem as a threat not only to the citizens of their countries, but also to international trade and economic well-being. They agreed that all APEC economies need to develop legal frameworks, with laws and policies that do the following:

1. Criminalize conduct such as gaining unauthorized access to computer systems and causing damage to computer systems.
2. Allow law enforcement authorities to collect electronic evidence.
3. Allow economies to cooperate with each other in investigating and prosecuting cybercrime.

The European Union is also looking at cross-jurisdictional solutions for dealing with cybercrime. Because this is a new type of crime, new approaches must be taken to deal with it. The three points above are just a start. As the internet becomes embedded in our civilization, the harm that it can cause must be mitigated through the legal systems of all nations on the planet.

Monday, May 21, 2007

RFID - Retiring the Barcode, or a Potential Invasion of Privacy?

Radio-frequency identification (RFID) is an automatic identification method, relying on storing and remotely retrieving data using devices called RFID tags or transponders. An RFID tag is a device that can be attached to an item for the purpose of identification using radio waves. All RFID tags contain at least two parts:
1. An integrated circuit for storing and processing information, modulating and demodulating a radio frequency (RF) signal and perhaps other specialized functions.
2. An antenna for receiving and transmitting the signal.
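As a toy illustration of those two parts, here is a minimal sketch in Python that models a tag and a reader. The class and field names are my own invention, and the "air interface" is just a method call; real tags respond over modulated RF and use anti-collision protocols that are out of scope here:

```python
# Toy model of an RFID tag/reader exchange -- illustrative only.

class RFIDTag:
    """Stands in for the integrated circuit: stores an ID and some data."""
    def __init__(self, tag_id: str, data: dict):
        self.tag_id = tag_id   # e.g. an EPC-style identifier
        self.data = data       # whatever the chip stores about the item

    def respond(self):
        # In real hardware, the antenna transmits this when queried.
        return (self.tag_id, self.data)

class RFIDReader:
    """Queries every tag in range and collects the responses."""
    def scan(self, tags_in_range):
        return [tag.respond() for tag in tags_in_range]

pallet = [
    RFIDTag("EPC-0001", {"sku": "TV-42", "origin": "plant A"}),
    RFIDTag("EPC-0002", {"sku": "TV-42", "origin": "plant B"}),
]
reader = RFIDReader()
for tag_id, data in reader.scan(pallet):
    print(tag_id, data)
```

Note that, unlike a barcode, nothing in the exchange requires line of sight or one-at-a-time handling, which is exactly the supply-chain appeal.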


In 2005, more than 1.3 billion RFID tags were produced, and according to a report by In-Stat, 33 billion tags will be manufactured by 2010. From a supply chain perspective, most of this is driven by Wal-Mart, which has mandated that its top 100 (and, later, its top 300) suppliers begin to use RFID. This will eventually replace the barcode as a means of identifying merchandise.


However, this has raised the ire of many privacy advocates. Plans to put RFID in US passports have met with resistance, as a person could unwittingly broadcast personal information (age, sex, date of birth, etc.), which could lead to a rash of identity theft. So until the wireless security is foolproof, maybe it is a good idea to stick to tagging clothes on a rack before getting around to the person who wears them. Then and only then can it be used for things such as keeping any "undesirables" (such as the ones seen below) away from a particular establishment.


Saturday, May 19, 2007

Are Blogs Legitimate News Sources?

Traditionally, news has always been broadcast by large entities with armies of professional journalists. These entities come in the form of corporations, such as NBC, CBS, ABC, and CTVGlobeMedia, or public/not-for-profit organizations, such as CBC or PBS. Now, with the internet changing how information is exchanged, from centralized to decentralized sources, we see blogs influencing mainstream media. The first blog to influence mass news as we know it was the Drudge Report, which broke the story of the Bill Clinton/Monica Lewinsky affair to the world before any of the major media outlets got wind of it.



Even though major news networks, such as CNN, have integrated blogging as a part of their news delivery, they have also argued that there are no adequate checks and balances when it comes to “breaking news” that originates from blogs. However, a blogger by the name of Charles Johnson also proved the converse true. In 2004, Johnson, who operates the Little Green Footballs blog, helped prove beyond a reasonable doubt that the documents raising questions about President Bush's National Guard service, presented by Dan Rather on CBS' "60 Minutes", were fakes. Many agree that this ultimately led to the resignation of Rather, the star anchor of the network.

So it looks like blogging is here to stay, and it is slowly becoming a legitimate source of news. But as with anything, one always has to keep the credibility of the source in mind, whether it is a major broadcaster or some guy sitting at home in his den in a housecoat.

Sunday, May 13, 2007

Second Life – Is This For Real?


For those who are not familiar, an American company called Linden Labs has developed an online virtual world called Second Life, where an individual can log on and create a character, called an avatar. It provides social networking in real time, and unlike most online games, such as Warcraft, there is no particular purpose except to interact with other avatars. Here is a YouTube clip describing some of the features of this virtual world.



Obviously, it must be a great escape from reality for a little while. But after seeing that stores such as American Apparel, Sears, and Circuit City are already selling their wares in this world for real US currency (as mentioned in the YouTube clip, Second Life's currency is the Linden Dollar, which can be purchased on the Linden Currency Exchange, or LindeX), it appears that the line that separates this virtual world from the non-virtual world is getting blurrier.

This online society, complete with a functioning mass-market economy, rock concerts, religious organizations, political campaigns, and its own pitfalls such as crime, doesn’t sound like a place where one can escape the banalities or stresses of the non-virtual world. In fact, it seems to get more and more “real” as it evolves. So to escape this, do avatars go have a beer, read a book, or blow off some steam at the gym? If they do, then you’ve got to wonder, “why would anyone want to be here?” Maybe I can start a place where avatars can have their own “people”, and meet for things like “dinner”, “a drink”, “conversation”, or “dance” in analog. I’m sure that it would be a pretty interesting place, although it wouldn’t accept Linden Dollars :) .

Tuesday, May 8, 2007

Newfoundland’s Contribution to Early Wireless Communications

When we think of wireless communications, we think of cell phones, PDAs, and other handheld devices that allow us to roam freely while tormenting those around us in supermarkets, movie theatres, and on public transit. Wireless technology is a facet of modern-day life. But have you ever stopped and wondered how it all came about?

I’m sure from reading the title, many folks would probably think that Newfoundland’s contribution to the wireless communications is a paper plane with a note written on it. I invite those people to read the following brief, before getting on with their daily routine, such as reading Margaret Wente, or sticking their head back in their a**.

Anyway, in 1901, an Italian named Guglielmo Marconi believed that radio waves could travel with the curvature of the earth. This went against the accepted theory of the day, which held that radio waves traveled in a straight line from their point of origin into outer space. By proving otherwise, Marconi could demonstrate that communicating with offshore ships was possible. In essence, he would introduce wireless telegraphy to the world. He did not invent wireless communications, as scientists such as Tesla, Hertz, and Lodge had made important discoveries in radio communications leading up to this. However, he is regarded as the individual who took wireless telegraphy out of the lab and into use by the general public.

What does this have to do with Newfoundland? The point Marconi selected to receive that wireless transmission was Signal Hill in St. John’s (the transmission was sent from Cornwall, England). Given its location as the most easterly point in North America, it made geographical sense. Plus, the original receiving site in Cape Cod, Massachusetts had been destroyed in a storm.

For more details on how the experiment transpired, click on the following link. Or for the movie version, click here. Below is a picture of Marconi (left) and his assistants getting ready for the wireless experiment.





Tuesday, May 1, 2007

Why Software is Never Flawless

A civil engineer once told me that if bridges, roads, and buildings were constructed with the same level of quality and integrity as a software program, society would cease to function after the first heavy windstorm. I have often wondered why we tolerate the abysmal record of software failures and glitches, given that information technology is the engine of commerce, the backbone of civilization, and all that jazz. But seriously, for those of us in the business, whether toiling in a back office sweatshop batch processing bank statements, or designing the next great killer app that will change the way information is processed forever (read 3 to 5 years), failure is quite common; more common than in other disciplines. Even though the number of successful software projects has improved due to better processes, skill sets, and management, there are still two main causes for the demise of a software project:

1. The software does not do what it is intended to do – missed or misinterpreted requirements.
2. The software cannot do what it is intended to do – shoddy or rushed development.

Imagine this discussion between two architects:

Architect 1: Hey Bob, how is the new museum construction down in Florida coming along?
Architect 2: It’s going ok Jane. Although the glass ceiling we installed can only withstand a 15 km/hr wind. Anything above that, the whole shebang will collapse inward.
Architect 1: Jesus! And you guys are still going ahead with this?
Architect 2: Yeah, the timelines were tight but we have a contingency plan in place. If it gets too windy, we’ll move the dinosaur bones and the Faberge Egg exhibit to the basement. If the ceiling does come down, the museum will refund admissions for the day. Then we’ll come in and patch up the ceiling with a new stronger glass that we wanted to put in to begin with, but couldn’t because we didn’t have the time or money. Plus the insurance will cover the costs of installation and materials.
Architect 1: Wow, you think of everything!

Or between a plumber and his customer:

Plumber: Ok, I have your toilet installed.
Customer: There is no water in the bowl. How come the tank doesn’t have any flush mechanisms? It’s empty!
Plumber: Oh, you mean like the float ball? That’s part of the upgrades to Toilet 2.0. It will be released next quarter. But you’ll get it for a discount when it comes out.
Customer: Alright, sounds good.

Obviously these scenarios are totally ludicrous. Yet, when it comes to many software systems, this is commonplace. Decisions are made to implement erroneous software, and mitigate the fallout after. Why is this?

The cost of developing a quality software product can be quite high. Unlike hardware, or any other physical entity, one can never test for all potential software defects, and therefore software is often shipped to the end customer, bugs and all. The activities that are required for good software QA are listed in the following article: http://www.badsoftware.com/plaintif.htm. As one can imagine, the costs add up, which leads to my second point….

Why bother? Often there is a more lax attitude towards software quality than hardware quality because of the reason listed above. The costs to produce the perfect software could lead to release delays, or software that is too expensive for the intended market. Decisions are made to go with a less-than-optimal software application in the cases where there is little or no legal liability, or consumer backlash for doing so. So until it is demanded by a court of law, or by Adam Smith’s invisible hand (which could slap a buggy application like it was one of the Three Stooges), the failure of application software to do exactly what it is proposed to do in an effective manner will be a fact of life, unfortunately.
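The earlier claim that one can never test all potential defects is easy to make concrete with some quick arithmetic. Even a trivial function taking three 32-bit integers has an input space no test lab could ever cover (the billion-tests-per-second figure below is a deliberately generous assumption):

```python
# Why "test everything" is infeasible: count the input space of a
# trivial function taking three 32-bit integer arguments.

inputs_per_arg = 2 ** 32            # possible values of one 32-bit int
total_cases = inputs_per_arg ** 3   # all combinations of three arguments

tests_per_second = 1_000_000_000    # optimistic: a billion tests per second
seconds_per_year = 60 * 60 * 24 * 365

years_needed = total_cases / (tests_per_second * seconds_per_year)
print(f"{total_cases:.2e} cases, roughly {years_needed:.1e} years to run them all")
```

The combinations run to about 8 × 10^28, which works out to trillions of years of nonstop testing. So testers sample the input space instead, and some defects inevitably ship.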