
Testing Leads to Preventive Measures

Why is testing important for any product? Everybody knows that it is, yet it is often ignored deliberately, especially when the change is minor. People assume that small changes do not justify spending much time on testing. The reason it matters is that we use so many tools and technologies available on the market without knowing even 20% of their usage and implications. Do we really know whether these tools have passed all their own testing hurdles? Do we understand their limitations? Do we understand their internal complexity? Do we analyze whether they fit our project? Even when the architect or designer knows these things, is that knowledge passed down to the developers? The answer is no!

That is why I would suggest allocating more review and testing time than before.

The testing team should be completely separate from the development team and should report directly to the client. That way, the client can verify the correctness and completeness of the development work. This is a difficult thing to convince the services industry of, but clients should make themselves aware of these facts.

Software testing ensures that the business and technical requirements are met, based on test data, under both controlled and uncontrolled operating conditions.

Software testing is the process of creating test cases from the requirements (both business and technical) and running them under defined conditions to check for the expected results. The result of a test is either pass or fail. You can compare this to your exams: it is examination time for the development team.
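
In code terms, a test case boils down to running the unit under test with known inputs and comparing the output with the expected result - the verdict is binary, pass or fail. Here is a minimal, generic C++ sketch; the add function is purely illustrative and only stands in for whatever unit is actually being tested:

// A minimal test case: exercise the unit under test with known inputs
// and compare the actual output against the expected result.
#include <iostream>

// Hypothetical unit under test.
int add(int a, int b) { return a + b; }

bool testAdd() {
    const int expected = 5;
    const int actual = add(2, 3);
    return actual == expected;   // the verdict is binary: pass or fail
}

int main() {
    std::cout << (testAdd() ? "PASS" : "FAIL") << "\n";
    return 0;
}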

There are various methods that can be used for a software product, but the method should be chosen based on the following criteria: 1. the type of project/product (development, maintenance, support); 2. the size of the project/product; 3. the talent available in the team. Testing is a major contributor in deciding which preventive actions to take.

Companies normally spend a good amount of money on quality, Six Sigma and CMM assessments, but one thing they forget is that all these processes work only when you have a strong review and testing team and process. Unless you understand every possible issue, small or big, there is no way you can prevent it in the future.

I remember one incident when we were developing a client-based product. This was at a time when we were hardcore developers. We had built it on a solid design (using OOP methodology, which was a big achievement in those days) and used all the robust objects available under VC++. The product was completed and I was assigned to test it. I tested it and passed it. However, one of my friends had doubts and sat with me for another round of testing. We tested the product by running the same application again and again. On the 14th run, an object in the menu disappeared! With each subsequent test cycle, more objects disappeared.

The problem was that we were releasing an object we had created ourselves, even though, according to the guidelines, that same object would also be released automatically by the framework - a double release.
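
Here is a minimal, hypothetical C++ sketch of that kind of double-release bug - not our actual product code; the Menu and MenuItem types are invented for the example. The application deletes an object that its owner will delete again during cleanup, and the damage may only show up after the program has run many times:

// Hypothetical illustration of a double delete: the menu owns the item and
// releases it during cleanup, yet the application code releases it as well.
struct MenuItem {
    const char* label;
    explicit MenuItem(const char* l) : label(l) {}
};

struct Menu {
    MenuItem* item = nullptr;

    void attach(MenuItem* m) { item = m; }   // the menu takes ownership
    ~Menu() { delete item; }                 // framework-style cleanup releases the item
};

int main() {
    Menu menu;
    MenuItem* about = new MenuItem("About");
    menu.attach(about);

    delete about;   // the application releases the object it created...
    return 0;
}                   // ...and ~Menu() releases the same object again: undefined behaviour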

You can guess what the preventive measure for this is: thorough, repeated testing, even when the design looks solid and the change looks small.


Financial Software Forex Trading - Get Financial Freedom

Financial Software Forex Trading

Are you looking for a forex trading tool but finding it difficult to pick the best one for you? A lot of people find it quite difficult to choose the right software, mainly because it takes a lot of time to research the options and identify the one that really suits you.

If you think such software is not necessary, you are wrong. Keep in mind that you need the right application if you actually want to earn in this market. So let us take a closer look at this kind of software. A simple way to make a profit: the main reason people go for forex trading software is that it is the simplest route to profit when it comes to investing in shares and stocks.

The difficulty lies in the fact that there is a lot of forex trading software on the market, which makes choosing the best one a real problem. If you do your research properly, you will be able to find the right one for you. You should be able to distinguish between the different packages available and keep an open mind so that you do not fall prey to the wrong software.

When you look into this software, you will find that there are automated packages that run 24 hours a day, 7 days a week. Automated forex software never sleeps and carries on conducting trades at any time, day or night. It helps you buy at a lower price and then sell at a higher price even while you are asleep.

Another feature of this software is that it is self-adapting: it updates itself and finds the best options for you. Some packages use automated online exchange information and help you make trades quickly without taking up much of your time. It is also very affordable. When it comes to the price of forex trading software, many novice traders hold the misconception that they will have to burn a hole in their pocket to get one. In reality, it is quite affordable and it makes the process of currency trading very efficient as well.

It helps eliminate human errors and other problems associated with manual trading. It is inexpensive, so you can get one without a second thought. It is also very fast - a transaction takes only a fraction of the time. Stop what you are doing RIGHT NOW and get your life-changing Financial Software Forex Trading program. It'll change your life forever!






Forex Trade Software - Benefits of Forex Trading Software

Forex Trade Software

Forex trading software has a number of benefits. It can automate many of the common tasks that you would otherwise have to perform yourself when investing. Using this type of program allows its user to look at trends and statistical analysis so that they can make better decisions.

It also allows you to trade online directly and, overall, it simplifies the investing process. If you choose to invest entirely on your own, the process can get pretty tedious and sometimes confusing, especially for someone who is just starting out. Forex trading software makes the process much easier and more streamlined. Simply log into your computer, execute trades, look at your past transaction history and get advice.

This can prove to be extremely helpful and may give you the little extra something that you need to make bigger profits. The ability to look at trends and your personal history, and to perform statistical analysis, allows you to take a much more educated and deliberate approach to investing. In this way, an investor can do much more than simply make guesses.

Instead, with these tools in hand, you can reap more profits and have greater success. Your personal information and that of the market will be presented in a well-formatted and easy-to-read manner. This will let you know exactly where your investment portfolio stands, when and whether there are any profits, and how the market is performing.

Forex trading software will also allow you to trade directly online. In many cases, you will be able to do just about everything needed to get your trading career started and on track. Overall, the biggest advantage of this type of application is that it simplifies the entire process. The analytical tools built into the program give its user the best chance of making profitable trades.

Of course, this requires picking the right product. The wrong one will have no benefit at all and may cause you to lose money. Stop what you are doing RIGHT NOW and get your life-changing Forex Trade Software program. It'll change your life forever!






Internet: A Medium or a Message

The State of the Net

An Interim Report about the Future of the Internet

Who are the participants who constitute the Internet?

Users - connected to the net and interacting with it

The communications lines and the communications equipment

The intermediaries (e.g. the suppliers of on-line information or access providers).

Hardware manufacturers

Software authors and manufacturers (browsers, site development tools, specific applications, smart agents, search engines and others).

The "Hitchhikers" (search engines, smart agents, Artificial Intelligence - AI - tools and more)

Content producers and providers

Suppliers of financial wherewithal (currently - corporate and institutional cash gradually being replaced by advertising money)

The fate of each of these components - separately and in solidarity - will determine the fate of the Internet.

The first phase of the Internet's history was dominated by computer wizards. Thus, any attempt at predicting its future dealt mainly with its hardware and software components.

Media experts, sociologists, psychologists, advertising and marketing executives were left out of the collective effort to determine the future face of the Internet.

As far as content is concerned, the Internet cannot be currently defined as a medium. It does not function as one - rather it is a very disordered library, mostly incorporating the writings of non-distinguished megalomaniacs. It is the ultimate Narcissistic experience. The forceful entry of publishing houses and content aggregators is changing this dismal landscape, though.

Ever since the invention of television there hasn't been anything as begging to become a medium as the Internet.

Three analogies spring to mind when contemplating the Internet in its current state:

A chaotic library

A neural network or the latter day equivalent of previous networks (telegraph, telephony, railways)

A new continent

These metaphors prove to be very useful (even business-wise). They permit us to define the commercial opportunities embedded in the Internet.

Yet, they fail to assist us in predicting its future in its transformation into a medium.

How does an invention become a medium? What happens to it when it does become one? What is the thin line separating the initial functioning of the invention from its transformation into a new medium? In other words: when can we tell that some technological advance gave birth to a new medium?

This work also deals with the image of the Internet once transformed into a medium.

The Internet has the most unusual attributes in the history of media.

It has no central structure or organization. It is hardware and software independent. It (almost) cannot be subjected to legislation or to regulation. Consider the example of downloading music from the internet - is it tantamount to an act of recording music (a violation of copyright laws)? This has been the crux of the legal battle between Diamond Multimedia (the manufacturers of the Rio MP3 device), MP3.com and Napster and the recording industry in America.

The Internet's data transfer channels are not linear - they are random. Most of its "broadcast" cannot be "received" at all. It allows for the narrowest of narrowcasting through the use of e-mail mailing lists, discussion groups, message boards, private radio stations, and chats. And this is but a small portion of an impressive list of oddities. These idiosyncrasies will also shape the nature of the Internet as a medium. Growing out of bizarre roots - it is bound to yield strange fruit as a medium.

So what business opportunities does the Internet represent?

I believe that they are to be found in two broad categories:

Software and hardware related to the Internet's future as a medium

Content creation, management and licensing

The Map of Terra Internetica

The Users

How many Internet users are there? How many of them have access to the Web (World Wide Web - WWW) and use it? There are no unequivocal statistics. Those who presume to give the answers (including the ISOC - the Internet SOCiety) - rely on very partial and biased resources. Others just bluff.

Yet, everyone seems to agree that there are, at least, 100 million active participants in North America (the Nielsen and Commerce-Net reports).

The future is, inevitably, even more vague than the present. Authoritative consultancy firms predict 66 million active users in 10 years time. IBM envisages 700 million users. MCI is more modest with 300 million. At the end of 1999 there were 130 million registered (though not necessarily active) users.

The Internet - an Elitist and Chauvinistic Medium

The average user of the Internet is young (30), with an academic background and high income. The percentage of the educated and the well-to-do among the users of the Web is three times as high as their proportion in the population. This is fast changing only because their children are joining them (6 million already had access to the Internet at the end of 1996 - and were joined by another 24 million by the end of the decade). This may change only due to presidential initiatives to bridge the "digital divide" (from Al Gore's in the USA to Mahatir Mohammed's in Malaysia), corporate largesse and institutional involvement (e.g., Open Society in Eastern Europe, Microsoft in the USA). These efforts will spread the benefits of this all-powerful tool among the less privileged. A bit less than 50% of all users are men but they are responsible for 60% of the activity in the net (as measured by traffic).

Women seem to limit themselves to electronic mail (e-mail) and to electronic shopping of goods and services, though this is changing fast. Men prefer information, either due to career requirements or because knowledge is power.

Most of the users are of the "experiencer" variety. They are innovative and are leaders of social change. This breed inhabits universities, fashionable neighbourhoods and trendy vocations. This is why some wonder if the Internet is not just another fad, albeit an incredibly resilient and promising one.

Most users have home access to the Internet - yet, they still prefer to access it from work, at their employer's expense, though this preference is slight and being eroded. Most users are, therefore, exploitative in nature. Still, we must not forget that there are 37 million households of the self-employed and this possibly distorts the statistical picture somewhat.

The Internet - A Western Phenomenon

Not African, not Asian (with the exception of Israel and Japan), not Russian, nor a Third World phenomenon. It belongs squarely to the wealthy, sated world. It is the indulgence of those who have everything and whose greatest concern is their choice of nightly entertainment. Between 50-60% of all Internet users live in the USA, 5-10% in Canada. The Internet is catching on in Europe (mainly in Germany and in Scandinavia) and, in its mobile form (i-mode), in Japan. The Internet lost to the French Minitel because the latter provides more locally relevant content and because of high costs of communications and hardware.

Communications

Most computer owners still possess a 28,800 bps modem. This is much like riding a bicycle on a German Autobahn. The 56,000 bps modem is gradually replacing its slower predecessor (it is found in 48% of computers with modems) - but even this is hardly sufficient. To begin to enjoy video and audio (especially the former), data transfer rates need to be 50 times faster.
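
By that arithmetic (mine, not the author's), 50 times 56,000 bps comes to roughly 2.8 Mbps - territory reachable only by the cable and DSL class connections discussed below.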

Half the households in the USA have at least 2 telephones and one of them is usually dedicated to data processing (faxes or fax-modems).

The ISDN could constitute the mid-term solution. This data transfer network is fairly speedy and covers 70% of the territory of the USA. It is growing by 100% annually and its sales topped 10 billion USD in 1995/6.

Unfortunately, it is quite clear that ISDN is not THE answer. It is too slow, too user-unfriendly, has a bad interface with other network types, and requires special hardware. There is no point in investing in temporary solutions when the right solution is staring the Internet in the face, though it is not implemented due to political circumstances.

A cable modem is 80 times speedier than ISDN and 700 times faster than a 14,400 bps modem. However, it does have problems in accommodating two-way data transfer. There is also a need to connect the fibre optic infrastructure which characterizes cable companies to the old copper coaxial infrastructure which characterizes telephony. Cable users rely on specially customized LANs (Ethernet) and the hardware is expensive (though equipment prices are forecast to collapse as demand increases). Cable companies simply did not invest in developing the technology. The law (prior to the 1996 Telecommunications Act) forbade them to do anything that was not one-way transfer of video via cables. Now, with the more liberal regulatory environment, it is a mere question of time until the technology is found.

Actually, most consumers single out bad customer relations as their biggest problem with the cable companies - rather than technology.

Experiments conducted with cable modems led to a doubling of usage time (from an average of 24 to 47 hours per month per user) which was wholly attributable to the increased speed. This comes close to a cultural revolution in the allocation of leisure time. Numerically speaking: 7 million households in the USA are fitted with two-way data transfer cable modems. This is a small number and it is anyone's guess if it constitutes a critical mass. Sales of such modems amount to 1.3 billion USD annually.

50% of all cable subscribers also have a PC at home. To me it seems that the merging of the two technologies is inevitable.

Other technological solutions - such as DSL, ADSL, and the more promising satellite broadband - are being developed and implemented, albeit slowly and inefficiently. Coverage is sporadic and frustrating waiting periods are measured in months.

Hardware and Software

Most Internet users (82%) work with the Windows operating system. About 11% own a Macintosh (much stronger graphically and more user-friendly). Only 7% continue to work on UNIX-based systems (which, historically, fathered the Internet) - and this number is fast declining. A strong entrant is the open-source Linux operating system.

Virtually all users surf through a browsing software. A fast dwindling minority (26%) use Netscape's products (mainly Navigator and Communicator) and the majority use Microsoft's Explorer (more than 60% of the market). Browsers are now free products and can be downloaded from the Internet. As late as 1997, it was predicted by major Internet consultancy firms that browser sales will top $4 billion by the year 2000. Such misguided predictions ignored the basic ethos of the Internet: free products, free content, free access.

Browsers are in for a great transformation. Most of them are likely to have 3-D, advanced audio, telephony / voice / video mail (v-mail), instant messaging, e-mail, and video conferencing capabilities integrated into the same browsing session. They will become self-customizing, intelligent, Internet interfaces. They will memorize the history of usage and user preferences and adapt themselves accordingly. They will allow content-specificity: unidentifiable smart agents will scour the Internet, make recommendations, compare prices, order goods and services and customize contents in line with self-adjusting user profiles.

Two important technological developments must be considered:

PDAs (Personal Digital Assistants) - the ultimate personal (and office) communicators, easy to carry, they provide Internet (access) Everywhere, independent of suppliers and providers and of physical infrastructure (in an aeroplane, in the field, in a cinema).

The second trend: wireless data transfer and wireless e-mail, whether through pagers, cellular phones, or through more sophisticated apparatus and hybrids such as smart phones. Geotech's products are an excellent example: e-mail, faxes, telephone calls and a connection to the Internet and to other, public and corporate, or proprietary, databases - all provided by the same gadget. This is the embodiment of the electronic, physically detached, office. Wearable computing should be considered a part of this "ubiquitous or pervasive computing" wave.

We have no way of gauging - or intelligently guessing - the part of the mobile Internet in the total future Internet market, but it is likely to outweigh the "fixed" part. Wireless Internet meshes well with the trend of pervasive computing and the intelligent home and office. Household gadgets such as microwave ovens, refrigerators and so on will connect to the Internet via a wireless interface to cull data, download information, order goods and services, report their condition and perform basic maintenance functions. Location-specific services (navigation, shopping recommendations, special discounts, deals and sales, emergency services) depend on the technological confluence between GPS (satellite-based geolocation technology) and wireless Internet.

Suppliers and Intermediaries

"Parasitic" intermediaries occupy each stage in the Internet's food chain.

Access to the Internet is still provided by "dumb pipes" - the Internet Service Providers (ISP)

Content is still the preserve of content suppliers and so on.

Some of these intermediaries are doomed to gradually fade or to suffer a substantial diminishing of their share of the market. Even "walled gardens" of content (such as AOL) are at risk.

By way of comparison, even today, ISPs have four times as many subscribers (worldwide) as AOL. Admittedly, this adversely affects the quality of the Internet - the infrastructure maintained by the phone companies is slow and often succumbs to bottlenecks. The unequivocal intention of the telephony giants to become major players in the Internet market should also be taken into account. The phone companies will, thus, play a dual role: they will provide access to their infrastructure to their competitors (sometimes, within a real or actual monopoly) - and they will compete with their clients. The same can be said about the cable companies. Controlling the last mile to the user's abode is the next big business of the Internet. Companies such as AOL are disadvantaged by these trends. It is imperative for AOL to obtain equal access to the cable company's backbone and infrastructure if it wants to survive. Hence its merger with Time Warner.

No wonder that many of the ISPs judge this intrusion on their turf by the phone and cable companies to constitute unfair competition. Yet, one should not forget that the barriers to entry are very low in the ISP market. It takes a minimal investment to become an ISP. 200 modems (which cost 200 USD each) are enough to satisfy the needs of 2000 average users who generate an income of 500,000 USD per annum to the ISP. Routers are equally as cheap nowadays. This is a nice return on the ISP's capital, undoubtedly.
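
To spell out the arithmetic implied by those figures: 200 modems at 200 USD each come to 40,000 USD of equipment, serving 2,000 users who bring in 500,000 USD a year (about 250 USD per user) - annual revenue of roughly twelve times the modem investment.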

The Hitchhikers

The Web houses the equivalent of 100 billion pages. Search Engine applications are used to locate specific information in this impressive, constantly proliferating library. They will be replaced, in the near future, by "Knowledge Structures" - gigantic encyclopaedias, whose text will contain references (hyperlinks) to other, relevant, sites. The far future will witness the emergence of the "Intelligent Archives" and the "Personal Newspapers" (read further for detailed explanations). Some software applications will summarize content, others will index and automatically reference and hyperlink texts (virtual bibliographies). An average user will have an on-going interest in 500 sites. Special software will be needed to manage address books ("bookmarks", "favourites") and contents ("Intelligent Addressbooks"). The phenomenon of search engines dedicated to searching a number of search engines simultaneously will grow ("Hyper- or meta-engines"). Meta-engines will work in the background and download hyperlinks and advertising (the latter is essential to secure the financial interest of site developers and owners). Statistical software which tracks ("how long was what done"), monitors ("what did they do while in the site") and counts ("how many") visitors to sites already exists. Some of these applications have back-office facilities (accounting, follow-up, collections, even tele-marketing). They all provide time trails and some allow for auditing.

This is but a small fragment of the rapidly developing net-scape: people and enterprises who make a living off the Internet craze rather than off the Internet itself. Everyone knows that there is more money in lecturing about how to make money on the Internet - than in the Internet itself. This maxim still holds true despite the 32 billion US dollars in E-commerce in 1998. Business to Consumer (B2C) sales grow less vigorously than Business to Business (B2B) sales and are likely to suffer another blow with the advent of Peer to Peer (P2P) computer networks. The latter allow PCs to act as servers and thus enable the swapping of computer files among connected users (with or without a central directory).

Content Suppliers

This is the underprivileged sector of the Internet. They all lose money (even e-tailers which offer basic, standardized goods - books, CDs - with the exception, until September 11, of sites connected to tourism). No one thanks them for content produced with the investment of a lot of effort and a lot of money. A really qualitative, fully commerce enabled site costs up to 5,000,000 USD, excluding site maintenance and customer and visitor services. Content providers are constantly criticized for lack of creativity or for too much creativity. More and more is asked of them. They are exploited by intermediaries, hitchhikers and other parasites. This is all an off-shoot of the ethos of the Internet as a free content area.

More than 100 million men and women constantly access the Web - but this number stands to grow (the median prediction: 300 million). Yet, while the Web is used by 35% of those with access to the Internet - e-mail is used by more than 60%. E-mail is by far the most common function ("killer app") and specialized applications (Eudora, Internet Mail, Microsoft Exchange) - free or ad sponsored - keep it accessible to all and user-friendly.

Most of the users like to surf (browse, visit sites) the net without reason or goal in mind. This makes it difficult to apply traditional marketing techniques.

What is the meaning of "targeted audiences" or "market shares" in this context?

If a surfer visits sites which deal with aberrant sex and nuclear physics in the same session - what to make of it?

The public and legislative backlash against the gathering of surfers' data by Internet ad agencies and other web sites - has led to growing ignorance regarding the profile of Internet users, their demography, habits, preferences and dislikes.

People like the very act of surfing. They want to be entertained, then they use the Internet as a working tool, mostly in the service of their employer, who usually foots the bill. Users love free downloads (mainly software).

"Free" is a key word on the Internet: it used to belong to the US Government and to a bunch of universities. Users like information, with emphasis on news and data about new products. But they do not like to shop on the net - yet. Only 38% of all surfers made a purchase during 1998.

67% of them adore virtual sex. 50% of the sites most often visited are porn sites (this is reminiscent of the early days of the Video Cassette Recorder - VCR). People dedicate the same amount of time to watching video cassettes or television as they do to surfing the net. The Internet seems to cannibalize television.

Sex is followed by music, sports, health, television, computers, cinema, politics, pets and cooking sites. People are drawn to interactive games. The Internet will shortly enable people to gamble, if not hampered by legislation. 10 billion USD in gambling money are predicted to pass through the net. This makes sense: nothing like a computer to provide immediate (monetary and psychological) rewards.

Commerce on the net is another favourite. The Internet is a perfect medium for the sale of software and other digital products (e-books). The problem of data security is on its way to being solved with the SET (or other) world standard.

As early as 1995, the Internet had more than 100 virtual shopping malls visited by 2.5 million shoppers (and probably double this number in 1996).

The predictions for 1999 - between 1 and 5 billion USD of net shopping (plus 2 billion USD through on-line information providers, such as CompuServe and AOL) - proved woefully inaccurate. The actual number in 1998 was 7 times the prediction for 1999.

It is also widely believed that circa 20% of the family budget will pass through the Internet as e-money and this amounts to 150 billion USD.

The Internet will become a giant inter-bank clearing system and varied ATM type banking and investment services will be provided through it. Basically, everything can be done through the Internet: looking for a job, for instance.

Yet, the Internet will never replace human interaction. People are likely to prefer personal banking, window shopping and the social experience of the shopping mall to Internet banking and e-commerce, or m-commerce.

Some sites already sport classified ads. This is not a bad way to defray expenses, though most classified ads are free (it is the advertising they attract that matters).

Another developing trend is website-rating and critique. It will be treated the way today's printed editions are. It will have a limited influence on the consumption decisions of some users. Browsers already sport buttons labelled "What's New" and "What's Hot". Most Search Engines recommend specific sites. Users are cautious. Studies discovered that no user, no matter how heavy, has consistently re-visited more than 200 sites, a minuscule number. The 10 most popular web sites (Yahoo!, MSN, etc.) attracted more than 50% of all Internet traffic. Site recommendation services often produce random - at times, wrong - selections for their user. There are also concerns regarding privacy issues. The backlash against Amazon's "readers' circles" is an example.

Web Critics, who work today mainly for the printed press, will publish their wares on the net and will link to intelligent software which will hyperlink, recommend and refer. Some web critics will be identified with specific applications - really, expert systems which will incorporate their knowledge and experience.

The Money

Where will the capital needed to finance all these developments come from?

Again, there are two schools:

One says that sites will be financed through advertising - and so will search engines and other applications accessed by users.

Certain ASPs (Application Service Providers which rent out access to application software which resides on their servers) are considering this model.

The second version is simpler and allows for the existence of non-commercial content.

It proposes to collect negligible sums (cents or fractions of cents) from every user for every visit ("micro-payments") or a subscription fee. These accumulated cents or subscription fees will enable the owners of old sites to update and to maintain them and encourage entrepreneurs to develop new ones. Certain content aggregators (especially of digital textbooks) have adopted this model (Questia, Fathom).

The adherents of the first school pointed at the 5 million USD invested in advertising during 1995 and to the 60 million or so invested during 1996.

Its opponents point exactly at the same numbers: ridiculously small when contrasted with more conventional advertising modes. The potential of advertising on the net is limited to 1.5 billion USD annually in 1998, thundered the pessimists (many thought that even half that would be very nice). The actual figure was double the prediction but still woefully small and inadequate to support the Internet's content development.

Compare these figures to the sale of Internet software ($4 billion), Internet hardware ($3 billion), Internet access provision ($4.2 billion) in 1995.

Hambrecht & Quist estimated that Internet-related industries scooped up 23.2 billion USD annually (a report released in mid-1996).

And what follows advertising is hardly more encouraging.

The consumer interacts and the product is delivered to him. This - the delivery phase - is a slow and enervating epilogue to the exciting affair of ordering through the net at the speed of light. Too many consumers still complain that they do not receive what they ordered, or that delivery is late and products defective.

The solution may lie in the integration of advertising and content. Pointcast, for instance, integrated advertising into its news broadcasts, continuously streamed to the user's screen, even when inactive (they provided a downloadable active screen saver and ticker in a "push technology"). Downloading of digital music, video and text (e-books) will lead to immediate gratification of the consumer and will increase the efficacy of advertising.

Whatever the case may be, a uniform, agreed upon system of rating as a basis for charging advertisers, is sorely needed. There is also the question of what does the advertiser pay for?

Many advertisers (Procter and Gamble, for instance) refuse to pay according to the number of hits or impressions (=entries, visits to a site). They agree to pay only according to the number of times that their advertisement was hit (page views).

This different basis for calculation is likely to upset all revenue scenarios.

Very few sites of important, respectable newspapers are on a subscription basis. Dow Jones (Wall Street Journal) and The Economist, to mention but two.

Will this become the prevailing trend?

The Internet as a Metaphor

Three metaphors come to mind when considering the Internet "philosophically".

The Internet as a Chaotic Library

1. The Problem of Cataloguing

The Internet is an assortment of billions of pages containing information. Some of them are visible and others are generated from hidden databases by users' requests ("Invisible Internet").

The Internet displays no discernible order, classification, or categorization. As opposed to "classical" libraries, no one has invented a cataloguing standard (remember Dewey?). This is so needed that it is amazing that it has not been invented yet. Some sites indeed apply the Dewey Decimal System (Suite101). Others default to a directory structure (Open Directory, Yahoo!, Look Smart and others).

Had such a standard existed (an agreed upon numerical cataloguing method) - each site would have self-classified. Sites would have an interest to do so to increase their penetration rates and their visibility. This, naturally, would have eliminated the need for today's clunky, incomplete and (highly) inefficient search engines.

A site whose number starts with 900 will be immediately identified as dealing with history and multiple classification will be encouraged to allow finer cross-sections to emerge. An example of such an emerging technology of "self classification" and "self-publication" (though limited to scholarly resources) is the "Academic Resource Channel" by Scindex.
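
As an illustration of how such self-classification might be consumed - assuming, purely hypothetically, a Dewey-style numbering of sites - a browser or search engine could resolve the leading digit of a site's catalogue number to a subject class with something as simple as this C++ sketch (the prefix table merely mirrors the familiar Dewey hundreds classes):

// Illustrative only: resolve a Dewey-style catalogue number to a subject area,
// so that a site numbered "930.15" is immediately recognizable as history.
#include <iostream>
#include <map>
#include <string>

std::string classify(const std::string& catalogueNumber) {
    // Hypothetical prefix table modelled on the Dewey hundreds classes.
    static const std::map<char, std::string> classes = {
        {'0', "Generalities"},    {'1', "Philosophy"}, {'2', "Religion"},
        {'3', "Social sciences"}, {'4', "Language"},   {'5', "Science"},
        {'6', "Technology"},      {'7', "Arts"},       {'8', "Literature"},
        {'9', "History"}
    };
    if (catalogueNumber.empty()) return "Unclassified";
    const auto it = classes.find(catalogueNumber[0]);
    return it != classes.end() ? it->second : "Unclassified";
}

int main() {
    std::cout << classify("930.15") << "\n";   // prints "History"
    std::cout << classify("510.2") << "\n";    // prints "Science"
    return 0;
}

Multiple classification would then simply mean attaching more than one such number to a site.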

Users will not be required to remember reams of numbers. Future browsers will be akin to catalogues, very much like the applications used in modern day libraries. Compare this utopia to the current dystopia. Users struggle with reams of irrelevant material to finally reach a partial and disappointing destination. At the same time, there likely are web sites which exactly match the poor user's needs. Yet, what currently determines the chances of a happy encounter between user and content - are the whims of the specific search engine used and things like meta-tags, headlines, a fee paid, or the right opening sentences.

2. Screen versus Page

The computer screen, because of physical limitations (size, the fact that it has to be scrolled) fails to effectively compete with the printed page. The latter is still the most ingenious medium yet invented for the storage and release of textual information. Granted: a computer screen is better at highlighting discrete units of information. So, this draws the battle lines: structures (printed pages) versus units (screen), the continuous and easily reversible versus the discrete.

The solution is an efficient way to translate computer screens to printed matter. It is hard to believe, but no such thing exists. Computer screens are still hostile to off-line printing. In other words: if a user copies information from the Internet to his Word Processor (or vice versa, for that matter) - he ends up with a fragmented, garbage-filled and non-aesthetic document.

Very few site developers try to do something about it - even fewer succeed.

3. The Internet and the CD-ROM

One of the biggest mistakes of content suppliers is that they do not mix contents or have a "static-dynamic interaction".

The Internet can now easily interact with other media (especially with audio CDs and with CD-ROMs) - even as the user surfs.

Examples abound:

A shopping catalogue can be distributed on a CD-ROM by mail. The Internet Site will allow the user to order a product previously selected from the catalogue, while off-line. The catalogue could also be updated through the site (as is done with CD-ROM encyclopedias).

The advantages of the CD-ROM are clear: very fast access time (dozens of times faster than the access to a site using a dial up connection) and a data storage capacity tens of times bigger than the average website.

Another example: a CD-ROM can be distributed, containing hundreds of advertisements. The consumer will select the ad that he wants to see and will connect to the Internet to view a relevant video.

He could then also have an interactive chat (or a conference) with a salesperson, receive information about the company, about the ad, about the advertising agency which created the ad - and so on.

CD-ROM based encyclopedias (such as the Britannica, Encarta, Grolier) already contain hyperlinks which carry the user to sites selected by an Editorial Board.

But CD-ROMs are probably a doomed medium. This industry chose to emphasize the wrong things. Storage capacity increased exponentially and, within a year, desktops with 80 Gb hard disks will be common. Moreover, the Network Computer - the stripped down version of the personal computer - will put at the disposal of the average user terabytes in storage capacity and the processing power of a supercomputer. What separates computer users from this utopia is the communication bandwidth. With the introduction of radio, satellite, ADSL broadband services, cable modems and compression methods - video (on demand), audio and data will be available speedily and plentifully.

The CD-ROM, on the other hand, is not mobile. It requires installation and the utilization of sophisticated hardware and software. This is no user friendly push technology. It is nerd-oriented. As a result, CD-ROMs are not an immediate medium. There is a long time lapse between the moment they are purchased and the moment the first data become accessible to the user. Compare this to a book or a magazine. Data in these oldest of media is instantly available to the user and allows for easy and accurate "back" and "forward" functions.

Perhaps the biggest mistake of CD-ROM manufacturers has been their inability to offer an integrated hardware and software package. CD-ROMs are not compact. A Walkman is a compact hardware-cum-software package. It is easily transportable, it is thin, it contains numerous, user-friendly, sophisticated functions, it provides immediate access to data. So does the discman or the MP3-man. This cannot be said of the CD-ROM. By tying its future to the obsolete concept of stand-alone, expensive, inefficient and technologically unreliable personal computers - CD-ROMs have sentenced themselves to oblivion (with the possible exception of reference material).

4. On-line Reference Libraries

These already exist. A visit to the on-line Encyclopaedia Britannica exemplifies some of the tremendous, mind boggling possibilities:

Each entry is hyperlinked to sites on the Internet which deal with the same subject matter. The sites are carefully screened (though more detailed descriptions of each site should be available - they could be prepared either by the staff of the encyclopaedia or by the site owner). Links are available to data in various forms, including audio and video. Everything can be copied to the hard disk or to CD-ROMs.

This is a new conception of a knowledge centre - not just an assortment of material. It is modular, can be added on and subtracted from. It can be linked to a voice Q&A centre. Queries by subscribers can be answered by e-mail, by fax, posted on the site, hard copies can be sent by post. This "Trivial Pursuit" service could be very popular - there is considerable appetite for "Just in Time Information". The Library of Congress - together with a few other libraries - is in the process of making just such a service available to the public (CDRS - Collaborative Digital Reference Service).

5. The Feedback Option

Hard to believe, but very few sites encourage their guests to express an opinion about the site, its contents and its aesthetics. This indicates an ossified mode of thinking about the most dynamic mass medium ever created, the only interactive mass medium yet. Each site must absolutely contain feedback and rating questionnaires. It has the side benefit of creating a database of the visitors to the site.

Moreover, each site can easily become a "knowledge centre".

Let us consider a site dedicated to advertising and marketing:

It can contain feedback questionnaires (what do you think about the site, suggestions for improvement, mailto and leave message facilities, etc.)

It can contain rating questionnaires (rate these ads, these TV or radio shows, these advertising campaigns).

It can allocate some space to clients to create their home pages in (these home pages could lead to their sites, to other sites, to other sections of the host site - and, in any case, will serve as a display of the creative talent of the site owners). This will give the site owners a picture of the distribution of the areas of interest of the visitors to the site.

The site can include statistical, tracking and counter software.

Such a site can refer to hundreds of useful shareware applications (which deal with different aspects of advertising and marketing, for instance). Developers of applications will be able to use the site to promote their products. Other practical applications could also be referred to from - or reside on - the site (browsers, games, search engines).

And all this can be organized in a portal structure (for instance, by adopting the open software of the Open Directory Project).

6. Internet-Derived CD-ROMs

The Internet is an enormous reservoir of freely available, public domain, information.

With a minimal investment, this information can be gathered into coherent, theme oriented, cheap CD-ROMs. Each such CD-ROM can contain:

Addresses of web sites specific to the subject matter

The first pages of each of these sites

Hyperlinks to each of the sites

A browser

Access to all the important search engines

Recommended search strings (it is extremely difficult to formulate a successful search in the Internet, it takes expertise. "Ready-made searches" will be a hit in the future, as the number of sites grows)

A dictionary of professional terms, a speller and a thesaurus

A list of general reference sites

Shareware specific to the field

7. Publishing

The Internet is the world's largest "publisher", by far. It "publishes" FAQs (Frequently Asked Questions regarding almost every technical matter in the world), e-zines (electronic versions of magazines, not a very profitable pursuit), the electronic versions of dailies (together with on-line news and information services), reference and other e-books, monographs, articles and minutes of discussions ("threads"), among other types of material.

Publishing an e-zine has a few advantages: it promotes the sales of the printed edition, it helps to sign on subscribers and it leads to the sale of advertising space. The electronic archive function (see next section) saves the need to file back issues, the space required to do so and the irritating search for data items.

The future trend is a combined subscription: electronic (mainly for the archival value and the ability to hyperlink to additional information) and printed (easier to browse current issue).

The electronic daily presents other advantages:

It allows for immediate feedback and for flowing, almost real-time, communication between writers and readers. The electronic version, therefore, acquires a gyroscopic function: a navigation instrument, always indicating deviations from the "right" course. The content can be instantly updated and immediacy has its premium (remember the Lewinsky affair?).

Strangely, this (conventional) field was the first to develop a "virtual reality" facet. There are virtual "magazine stalls". They look exactly like the real thing and the user can buy a paper using his mouse.

Specialty hand held devices already allow for downloading and storage of vast quantities of data (up to 4000 print pages). The user gains access to libraries containing hundreds of texts, adapted to be downloaded, stored and read by the specific device. Again, a convergence of standards is to be expected in this field as well (the final contenders will probably be Adobe's PDF against Microsoft's MS-Reader).

Broadly, e-books are treated either as:

Continuation of print books (p-books) by other means

or as

A whole new publishing universe.

Since p-books are a more convenient medium than e-books - they will prevail in any straightforward "medium replacement" or "medium displacement" battle.

In other words, if publishers will persist in the simple and straightforward conversion of p-books to e-books - then e-books are doomed. They are simply inferior to the price, comfort, tactile delights, browseability and scanability of p-books.

But e-books - being digital - open up a vista of hitherto neglected possibilities. These will only be enhanced and enriched by the introduction of e-paper and e-ink. Among them:

Hyperlinks within the e-book and without it - to web content, reference works, etc.

Embedded instant shopping and ordering links

Divergent, user-interactive, decision driven plotlines

Interaction with other e-books (using a wireless standard) - collaborative authoring

Interaction with other e-books - gaming and community activities

Automatically or periodically updated content

Multimedia

Database, Favourites and History Maintenance (reading habits, shopping habits, interaction with other readers, plot related decisions and much more)

Automatic and embedded audio conversion and translation capabilities

Full wireless piconetworking and scatternetworking capabilities

The technology is still not fully there. Wars rage in both the wireless and the e-book realms. Platforms compete. Standards clash. Gurus debate. But convergence is inevitable and with it the e-book of the future.

8. The Archive Function

The Internet is also the world's biggest cemetery: tens of thousands of deadbeat sites, still accessible - the "Ghost Sites" of this electronic frontier.

This, in a way, is collective memory. One of the Internet's main functions will be to preserve and transfer knowledge through time. It is called "memory" in biology - and "archive" in library science. The history of the Internet is being documented by search engines (Google) and specialized services (Alexa) alike.

The Internet as a Collective Brain

Drawing a comparison from the development of a human baby - the human race has just commenced to develop its neural system.

The Internet fulfils all the functions of the Nervous System in the body and is, both functionally and structurally, pretty similar. It is decentralized and redundant (each part can serve as a functional backup in case of malfunction). It hosts information which is accessible in several ways, it contains a memory function, and it is multimodal (multimedia - textual, visual, audio and animation).

I believe that the comparison is not superficial and that studying the functions of the brain (from infancy to adulthood) - amounts to perusing the future of the Net itself.

1. The Collective Computer

To carry the metaphor of "a collective brain" further, we would expect the processing of information to take place in the Internet, rather than inside the end-user's hardware (the same way that information is processed in the brain, not in the eyes). Desktops will receive the results and communicate with the Net to receive additional clarifications and instructions and to convey information gathered from their environment (mostly, from the user).

This is part of the philosophy of the JAVA programming language. It deals with applets - small bits of software - and links different computer platforms by means of software.

Put differently:

Future servers will contain not only information (as they do today) - but also software applications. The user of an application will not be forced to buy it. He will not be driven into hardware-related expenditures to accommodate the ever growing size of applications. He will not find himself wasting his scarce memory and computing resources on passive storage. Instead, he will use a browser to call a central computer. This computer will contain the needed software, broken down into its elements (=applets, small applications). Anytime the user wishes to use one of the functions of the application, he will siphon it off the central computer. When finished - he will "return" it. Processing speeds and response times will be such that the user will not feel at all that it is not with his own software that he is working (the question of ownership will be very blurred in such a world). This technology is available and it has provoked a heated debate about the future shape of the computing industry as a whole (desktops - really power packs - or network computers, little more than dumb terminals). Applications are already offered to corporate users by ASPs (Application Service Providers).
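To make the thin-client idea above concrete, here is a minimal Python sketch of that model. Every name in it (AppletServer, ThinClient, the word_count "applet") is invented for illustration; it is not any real product's API, only a toy in which application logic lives in a central registry of small functions and the desktop borrows a function only for the moment it needs it.

    class AppletServer:
        """Central computer: holds small units of functionality ("applets")."""

        def __init__(self):
            self._applets = {}

        def publish(self, name, func):
            self._applets[name] = func

        def borrow(self, name):
            # The client "siphons off" an applet; here that is just a lookup.
            return self._applets[name]


    class ThinClient:
        """Keeps no application code of its own; it only receives results."""

        def __init__(self, server):
            self.server = server

        def run(self, applet_name, *args):
            applet = self.server.borrow(applet_name)   # borrow the function
            return applet(*args)                       # processing happens "on the net"


    if __name__ == "__main__":
        server = AppletServer()
        server.publish("word_count", lambda text: len(text.split()))

        client = ThinClient(server)
        print(client.run("word_count", "applications live on the server"))  # -> 5

In a real deployment the registry would sit behind a network protocol rather than a local dictionary; the sketch only shows the division of labour the paragraph describes.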

In the last few years, scientists have harnessed the combined power of the computers linked to the internet at any given moment to perform astounding feats of distributed parallel processing. Millions of PCs connected to the net co-process signals from outer space and meteorological data and solve complex equations. This is a prime example of a collective brain in action.
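The following toy Python sketch illustrates that work-unit pattern: one large job is split into independent chunks and processed in parallel. The "volunteer machines" are simulated with local worker processes, an assumption made purely to keep the example self-contained and runnable.

    from multiprocessing import Pool

    def process_work_unit(unit):
        """Stand-in analysis of one chunk of data: a sum of squares."""
        start, end = unit
        return sum(x * x for x in range(start, end))

    if __name__ == "__main__":
        # The coordinator splits a large job into independent work units...
        work_units = [(i, i + 100_000) for i in range(0, 1_000_000, 100_000)]

        # ...and each "volunteer" processes its unit in parallel.
        with Pool() as volunteers:
            partial_results = volunteers.map(process_work_unit, work_units)

        # The coordinator merges the partial results into the final answer.
        print("combined result:", sum(partial_results))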

2. The Intranet - a Logical Extension of the Collective Computer

LANs (Local Area Networks) are no longer a rarity in corporate offices. WANs (Wide Area Networks) are used to connect geographically dispersed organs of the same legal entity (branches of a bank, daughter companies, a sales force). Many LANs are wireless.

The intranet / extranet and wireless LANs will be the winners. They will gradually eliminate both fixed line LANs and WANs. The Internet offers equal, platform-independent, location-independent and time-of-day-independent access to all the members of an organization. Sophisticated firewall security applications protect the privacy and confidentiality of the intranet from all but the most determined and savvy hackers.

The Intranet is an intra-organizational communication network, constructed on the platform of the Internet, and it enjoys all its advantages. The extranet is open to clients and suppliers as well.

The company's server can be accessed by anyone authorized, from anywhere, at any time (with local - rather than international - communication costs). The user can leave messages (internal e-mail or v-mail), access information - proprietary or public - from it and participate in "virtual teamwork" (see next chapter).

By the year 2002, a standard intranet interface will emerge. This will be facilitated by the opening up of the TCP/IP communication architecture and its availability to PCs. A billion USD will go just to finance intranet servers - or, at least, this is the median forecast.

The development of measures to safeguard server routed inter-organizational communication (firewalls) is the solution to one of two obstacles to the institution of the Intranet. The second problem is the limited bandwidth which does not permit the efficient transfer of audio (not to mention video).

It is difficult to conduct video conferencing through the Internet. Even the voices of discussants who use internet phones come out (slightly) distorted.

All this did not prevent 95% of the Fortune 1000 from installing an intranet. 82% of the rest intend to install one by the end of this year. Medium to big size American firms have 50-100 intranet terminals per every internet one.

At the end of 1997, there were 10 web servers per every other type of server in organizations. The sale of intranet related software was projected to multiply by 16 (to 8 billion USD) by the year 1999.

One of the greatest advantages of the intranet is the ability to transfer documents between the various parts of an organization. Consider Visa: it pushed 2 million documents per day internally in 1996.

An organization equipped with an intranet can (while protected by firewalls) give its clients or suppliers access to non-classified correspondence. This notion has its charm. Consider a newspaper: it can give access to all the materials which were discarded by the editors. Some news items are fit to print - yet are discarded because of space limitations. Still, someone is bound to be interested. It costs the newspaper close to nothing (the material is, normally, already computer-resident) - and it might even generate added circulation and income. It can even be conceived as an "underground, non-commercial, alternative" newspaper for a wholly different readership.

The above is but one example of the possible use of the intranet to communicate with the organization's consumer base.

3. Mail and Chat

The Internet (its e-mail possibilities) is eroding traditional mail. The market share of the post office in conveying messages by regular mail has dwindled from 77% to 62% (1995). E-mail has expanded to capture 36% (up from 19%).

90% of customers with on-line access use e-mail from time to time and 60% work with it regularly. More than 2 billion messages traverse the internet daily.

E-mail applications are available as freeware and are included in all browsers. Thus, the Internet has completely assimilated what used to be a separate service, to the extent that many people make the mistake of thinking that e-mail is a feature of the Internet. Microsoft continues to incorporate previously independent applications in its browsers - a behaviour which led to the 1999 anti-trust lawsuit against it.

The internet will do to phone calls what it has done to mail. Already there are applications (Intel's, Vocaltec's, Net2Phone) which enable the user to conduct a phone conversation through his computer. The voice quality has improved. The discussants can cut into each other's words, argue and listen to tonal nuances. Today, the parties (two or more) engaging in the conversation must possess the same software and the same (computer) hardware. In the very near future, computer-to-regular phone applications will eliminate this requirement. And, again, simultaneous multi-modality: the user can talk over the phone, see his party, send e-mail, receive messages and transfer documents - without obstructing the flow of the conversation.

The cost of transferring voice will become so negligible that free voice traffic is conceivable in 3-5 years. Data traffic will overtake voice traffic by a wide margin.

This beats regular phones.

The next phase will probably involve virtual reality. Each of the parties will be represented by an "avatar", a 3-D figurine generated by the application (or the user's likeness mapped into the software and superimposed on the avatar). These figurines will be multi-dimensional: they will possess their own communication patterns, special habits, history, preferences - in short: their own "personality".

Thus, they will be able to maintain an "identity" and a consistent pattern of communication which they will develop over time.

Such a figure could host a site, accept, welcome and guide visitors, all the time bearing their preferences in its electronic "mind". It could narrate the news, like "Ananova" does. Visiting sites in the future is bound to be a much more pleasant affair.

4. E-cash

In 1996, the four corporate giants (Visa, MasterCard, Netscape and Microsoft) agreed on a standard for effecting secure payments through the Internet: SET. Internet commerce is supposed to mushroom by a factor of 50 to 25 billion USD. Site owners will be able to collect rent from passing visitors - or fees for services provided within the site. Amazon instituted an honour system to collect donations from visitors. Dedicated visitors will not be deterred by such trifles.

5. The Virtual Organization

The Internet allows simultaneous communication between an almost unlimited number of users. This is coupled with the efficient transfer of multimedia (video included) files.

This opens up a vista of mind boggling opportunities which are the real core of the Internet revolution: the virtual collaborative ("Follow the Sun") modes.

Examples:

A group of musicians will be able to compose music or play it - while spatially and temporally separated;

Advertising agencies will be able to co-produce ad campaigns in a real time interactive mode;

Cinema and TV films will be produced from disparate geographical spots through the teamwork of people who never meet, except through the net.

These examples illustrate the concept of the "virtual community". Locations in space and time will no longer hinder a collaboration in a team: be it scientific, artistic, cultural, or for the provision of services (a virtual law firm or accounting office, a virtual consultancy network).

Two ongoing developments are the virtual mall and the virtual catalogue.

There are well over 300 active virtual malls in the Internet. They were frequented by 32.5 million shoppers, who shopped in them for goods and services in 1998. The intranet can also be thought of as a "virtual organization", or a "virtual business".

The virtual mall is a computer "space" (pages) in the internet, wherein "shops" are located. These shops offer their wares using visual, audio and textual means. The visitor passes a gate into the store and looks through its offering, until he reaches a buying decision. Then he engages in a feedback process: he pays (with a credit card), buys the product and waits for it to arrive by mail. The manufacturers of digital products (intellectual property such as e-books or software) have begun selling their merchandise on-line, as file downloads.

Yet, slow communications and limited bandwidth constrain the growth potential of this mode of sale. Once solved - intellectual property will be sold directly from the net, on-line. Until such time, the intervention of the Post Office is still required. So, for now, the virtual mall is nothing but a glorified computerized mail catalogue or Buying Channel, the only difference being the exceptionally varied inventory.

Websites which started as "specialty stores" are fast transforming themselves into multi-purpose virtual malls. Amazon.com, for instance, has bought into a virtual pharmacy and into other virtual businesses. It is now selling music, video, electronics and many other products. It started as a bookstore.

This contrasts with a much more creative idea: the virtual catalogue. It is a form of narrowcasting (as opposed to broadcasting): a surgically accurate targeting of potential consumer audiences. Each group of profiled consumers (no matter how small) is fitted with their own - digitally generated - catalogue. This is updated daily: the variety of wares on offer (adjusted to reflect inventory levels, consumer preferences and goods in transit) - and prices (sales, discounts, package deals) change in real time.

The user will enter the site and there delineate his consumption profile and his preferences. A customized catalogue will be immediately generated for him.

From then on, the history of his purchases, preferences and responses to feedback questionnaires will be accumulated and added to a database.

Each catalogue generated for him will come replete with order forms. Once the user concluded his purchases, his profile will be updated.
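The flow just described - a declared profile drives which items appear, and every purchase feeds back into that profile - can be sketched in a few lines of Python. The product data and field names below are invented for illustration only.

    PRODUCTS = [
        {"name": "Jazz CD",        "category": "music", "price": 12.0},
        {"name": "History e-book", "category": "books", "price": 8.5},
        {"name": "Thriller novel", "category": "books", "price": 6.0},
        {"name": "Opera DVD",      "category": "video", "price": 15.0},
    ]

    def generate_catalogue(profile, products=PRODUCTS):
        """Return only the items matching the consumer's declared interests."""
        return [p for p in products
                if p["category"] in profile["interests"]
                and p["price"] <= profile["max_price"]]

    def record_purchase(profile, item):
        """Accumulate history so the next catalogue reflects actual behaviour."""
        profile["history"].append(item["name"])
        profile["interests"].add(item["category"])

    if __name__ == "__main__":
        profile = {"interests": {"books"}, "max_price": 10.0, "history": []}

        catalogue = generate_catalogue(profile)
        print("personalised catalogue:", [p["name"] for p in catalogue])

        record_purchase(profile, catalogue[0])   # the user buys one item
        print("updated profile:", profile)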

There are no technological obstacles to implementing this vision today - only administrative and legal ones. Big retail stores are not up to processing the flood of data expected to arrive. They also remain highly sceptical regarding the feasibility of the new medium. And privacy issues prevent data mining or the effective collection and usage of personal data.

The virtual catalogue is a private case of a new internet off-shoot: the "smart (shopping) agents". These are AI applications with "long memories".

They draw detailed profiles of consumers and users and then suggest purchases and refer to the appropriate sites, catalogues, or virtual malls.

They also provide price comparisons and the new generation (NetBot) cannot be blocked or fooled by using differing product categories.

In the future, these agents will also refer to real life retail chains and issue a map of the branch or store closest to an address specified by the user (the default being his residence). This technology can be seen in action in a few music sites on the web and is likely to be dominant with wireless internet appliances. The owner of an internet enabled (third generation) mobile phone is likely to be the target of geographically-specific marketing campaigns, ads and special offers pertaining to his current location (as reported by his GPS - satellite-based Global Positioning System).
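As a rough sketch of the price-comparison and nearest-branch behaviour described above, the short Python fragment below ranks offers for a single product; the sellers, prices and distances are entirely fictional.

    OFFERS = [
        {"seller": "StoreA", "price": 19.99, "distance_km": 12.0},
        {"seller": "StoreB", "price": 17.49, "distance_km": 35.0},
        {"seller": "StoreC", "price": 18.25, "distance_km": 3.5},
    ]

    def cheapest(offers):
        """The agent's price comparison: lowest price wins."""
        return min(offers, key=lambda o: o["price"])

    def nearest(offers):
        """The agent's map function: closest branch to the user's address."""
        return min(offers, key=lambda o: o["distance_km"])

    if __name__ == "__main__":
        print("best price:    ", cheapest(OFFERS)["seller"])
        print("nearest branch:", nearest(OFFERS)["seller"])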

6. Internet News

Internet news has clear advantages. It can be frequently and dynamically updated (unlike static print news), is always accessible (like print news), and is immediate and fresh.

The future will witness a form of interactive news. A special "corner" in the site will be open to updates posted by the public (the equivalent of press releases). This will provide readers with a glimpse into the making of the news, the raw material news are made of. The same technology will be applied to interactive TVs. Content will be downloaded from the internet and be displayed as an overlay on the TV screen or in a square in a special location. The contents downloaded will be directly connected to the TV programming. Thus, the biography and track record of a football player will be displayed during a football match and the history of a country when it gets news coverage.

Terra Internetica - Internet, an Unknown Continent

This is an unconventional way to look at the Internet. Laymen and experts alike talk about "sites" and "advertising space". Yet, the Internet was never compared to a new continent whose surface is infinite.

The Internet will have its own real estate developers and construction companies. The real life equivalents derive their profits from the scarcity of the resource that they exploit - the Internet counterparts will derive their profits from the tenants (the content).

Two examples:

A few companies bought "Internet Space" (pages, domain names, portals), developed it and make commercial use of it by:

renting it out

constructing infrastructure and selling it

providing an intelligent gateway, entry point to the rest of the internet

or selling advertising space which subsidizes the tenants (Yahoo!-Geocities, Tripod and others).

Cybersquatting (purchasing specific domain names identical to brand names in the "real" world) and then selling the domain name to an interested party

Internet Space can be easily purchased or created. The investment is low and getting lower with the introduction of competition in the field of domain registration services and the increase in the number of top domains.

Then, infrastructure can be erected - for a shopping mall, for free home pages, for a portal, or for another purpose. It is precisely this infrastructure that the developer can later sell, lease, franchise, or rent out.

At the beginning, only members of the fringes and the avant-garde (inventors, risk assuming entrepreneurs, gamblers) invest in a new invention. The invention of a new communications technology is mostly accompanied by devastating silence.

No one can say what the optimal uses of the invention are (in other words, what its future is). Many - mostly members of the scientific and business elites - argue that there is no real need for the invention and that it substitutes a new and untried way for old and tried modes of doing the same thing (so why assume the risk?)

These criticisms are usually founded:

To start with, there is, indeed, no need for the new medium. A new medium invents itself - and the need for it. It also generates its own market to satisfy this newly found need.

Two prime examples are the personal computer and the compact disc.

When the PC was invented, its uses were completely unclear. Its performance was lacking, its abilities limited, it was horribly user unfriendly.

It suffered from faulty design, absent user comfort and ease of use and required considerable professional knowledge to operate. The worst part was that this knowledge was unique to the new invention (not portable).

It reduced labour mobility and limited one's professional horizons. There were many gripes among those assigned to tame the new beast.

The PC was thought of, at the beginning, as a sophisticated gaming machine, an electronic baby-sitter. As the presence of a keyboard was detected and as the professional horizon cleared it was thought of in terms of a glorified typewriter or spreadsheet. It was used mainly as a word processor (and its existence justified solely on these grounds). The spreadsheet was the first real application and it demonstrated the advantages inherent to this new machine (mainly flexibility and speed). Still, it was more (speed) of the same. A quicker ruler or pen and paper. What was the difference between this and a hand held calculator (some of them already had computing, memory and programming features)?

The PC was recognized as a medium only 30 years after it was invented with the introduction of multimedia software. All this time, the computer continued to spin off markets and secondary markets, needs and professional specialities. The talk as always was centred on how to improve on existing markets and solutions.

The Internet is the computer's first important breakthrough. Hitherto the computer was only quantitatively different - the multimedia and the Internet have made it qualitatively superior, actually, sui generis, unique.

This, precisely, is the ghost haunting the Internet:

It has been invented, is maintained and is operated by computer professionals. For decades these people have been conditioned to think in Olympic terms: more, stronger, higher. Not: new, unprecedented, non-existent. To improve - not to invent. They stumbled across the Internet - it invented itself despite its own creators.

Computer professionals (hardware and software experts alike) - are linear thinkers. The Internet is non linear and modular.

It is still the age of hackers. There is still a lot to be done in improving technological prowess and powers. But their control of the contents is waning and they are being gradually replaced by communicators, creative people, advertising executives, psychologists and the totally unpredictable masses who flock to flaunt their home pages.

These all are attuned to the user, his mental needs and his information and entertainment preferences.

The compact disc is a different tale. It was intentionally invented to improve upon an existing technology (basically, Edison's Gramophone). Market-wise, this was a major gamble: the improvement was, at first, debatable (many said that the sound quality of the first generation of compact discs was inferior to that of its contemporaneous record players). Consumers had to be convinced to change both software and hardware and to dish out thousands of dollars just to listen to what the manufacturers claimed was better quality Bach. A better argument was the longer life of the software (though contrasted with the limited life expectancy of the consumer, some of the first sales pitches sounded absolutely morbid).

The computer suffered from unclear positioning. The compact disc was very clear as to its main functions - but had a rough time convincing the consumers.

Every medium is first controlled by the technical people. Gutenberg was a printer - not a publisher. Yet, he is the world's most famous publisher. The technical cadre is joined by dubious or small-scale entrepreneurs and, together, they establish ventures with no clear vision, market-oriented thinking, or orderly plan of action. The legislator is also dumbfounded and does not grasp what is happening - thus, there is no legislation to regulate the use of the medium. Witness the initial confusion concerning copyrighted software and the copyrights of ROM embedded software. Abuse or under-utilization of resources grow. Recall the sale of radio frequencies to the first cellular phone operators in the West - a situation which repeats itself in Eastern and Central Europe nowadays.

But then more complex transactions - exactly as in real estate in "real life" - begin to emerge.

This distinction is important. While in real life it is possible to sell an undeveloped plot of land - no one will buy "pages". The supply of these is unlimited - their scarcity (and, therefore, their virtual price) is zero.

The second example involves the utiliz




Day Trading Computer

Computers are so cheap nowadays that almost any new computer will be sufficient for day trading. Below I will highlight the basic components that you need for your day trading system: hardware, software, Internet connection, and system protection.

Hardware for day trading

This is a typical trading computer setup with two monitors connected to the same computer. This allows the day trader to expand the trading screen across both monitors as if it were one big rectangular monitor. Thus, more graphs and other market data can be observed by the day trader at the same time. Even though I do not have a Ph.D. in computers, I have used and helped people set up computers for day trading, and know other people who use computers for the same thing. Below I have provided the minimum general specifications for day traders followed by the preferred specifications (in parentheses):

Pentium III 700 MHz or higher (Pentium 4 best)

512 MB RAM or higher (1 GB or higher best)

Windows XP

One 21-inch CRT monitor (Two 19-inch LCD [flat screen] or greater preferred)

Please note that you will need two video cards (one per monitor) if you want to use two monitors, or a multi-head video card (like ATI, Matrox or Apian) that allows you to connect more than one monitor to it (best option). Unless you are a computer expert, try to order the system exactly like you want it (turnkey) from your chosen computer vendor. If you decide to use the trading robot, you don't need many of the hardware and software options described on this page.

Day trading software

Quality software is the most essential element in a robust day trading system. I believe that RealTick is the best software currently on the market for trading stocks. To trade currencies, such sophisticated software is not necessary. Every serious day trader uses special software for day trading. This software is installed on the trading computer mentioned above, and through it the day trader analyzes stocks or currencies and places orders to buy and sell. Since the trading software is the most important component of a complete day trading system, it is very important that the software be well-known and widely used by traders. The software for trading stocks should have the following components:

Level II (a list of all the buy and sell orders in the market)

Time and Sales (list of all transactions)

Real-time streaming quotes and charts (constantly updated with live market data)

Portfolio tracker

Real-time news

Order entry built into the software

The best software in this category that I know is called RealTick. It is given to you by your broker (a direct-access broker that uses the RealTick platform) when you open a brokerage account. If you want to day trade currencies instead (forex trading), you don't need all the above features. Currency trading software comes in either a standalone or JAVA version. The one I use is pretty easy to learn. To test drive the trading simulator of the system I use for 30 days, click on this link.
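The platforms named above are commercial products, so the following is only a toy Python sketch of one component from the list - the portfolio tracker - fed with simulated quotes rather than a real market data feed.

    import random

    class PortfolioTracker:
        """Tracks open positions and revalues them whenever a new quote arrives."""

        def __init__(self, positions):
            # positions: {symbol: (shares, cost_basis_per_share)}
            self.positions = positions
            self.last_quotes = {}

        def on_quote(self, symbol, price):
            self.last_quotes[symbol] = price

        def unrealized_pnl(self):
            pnl = 0.0
            for symbol, (shares, cost) in self.positions.items():
                price = self.last_quotes.get(symbol, cost)
                pnl += shares * (price - cost)
            return pnl

    if __name__ == "__main__":
        tracker = PortfolioTracker({"ABC": (100, 25.00), "XYZ": (50, 40.00)})

        # Simulated quote stream; a real feed would come from the broker's platform.
        for _ in range(5):
            for symbol, (_, cost) in tracker.positions.items():
                tracker.on_quote(symbol, cost * random.uniform(0.98, 1.02))
            print(f"unrealized P&L: {tracker.unrealized_pnl():+.2f}")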

Internet connection

A fast Internet connection is a must for serious day traders. Nowadays, DSL and Cable modem service are very affordable. Even if you buy the best computer that is currently available, without a fast Internet connection you cannot receive all of the streaming, real-time information (quotes, charts, transaction information, etc.) provided by your day trading software. For that reason, a 56-Kbps regular phone line modem is not an ideal primary connection to the Internet (except in rare cases). Suggestions (if available):

DSL (Digital Subscriber Line) service from your phone company, or

Cable Modem service from your cable company

I have used DSL for a few years already and think it's great (right now I use BellSouth DSL). DSL is much faster than a 56-K modem. Even though I have no experience with the cable modem service, I know that many traders use it and it is also extremely fast. After you call your DSL company and order the DSL service (about $50 per month), it usually takes the company about 2 weeks to send you the modem and software you need to use the service. If you do not have DSL or Cable Modem service in your area, one option is always to trade in the broker's office, with one of the high-speed computers that are already set up there.

System protection

An antivirus program will prevent destructive computer viruses from running on your trading system. Antivirus Program: All you have to do is listen to the news today and you will probably hear a story about a new computer virus that is causing havoc around the world. This is a common occurrence nowadays. Many viruses have caused quite a stir in the last few years, and have damaged a large number of computers throughout the world. A computer usually gets a virus when a file infected with the virus is opened by the user. Most people obtain these files as email attachments. If your computer files are destroyed by a virus, you won't be able to trade. To avoid this, you should use an antivirus program like Norton Antivirus or McAfee VirusScan. There are some free virus scanners out there, but you use them at your own risk. Most computers already come from the vendor with one of these two programs installed.

A firewall will block unauthorized access to your trading computer from the outside world. Firewall: If you use a high-speed Internet connection (like DSL or Cable Modem) you become vulnerable to "hacking." Hacking is when a person (hacker) breaks into (hacks) your computer from another computer. The hacker then takes control of the computer and can simply spy on you or delete the entire contents of your trading system. This can be a very serious setback for a trader. To prevent this from happening, a trader can use a "firewall." A firewall blocks unauthorized access to your trading computer from the outside world. A firewall can consist of additional hardware and/or software installed on the computer system. The easiest solution to set up is a software firewall. There are tons of different firewall software vendors in the market and many companies that provide free versions of their software. To learn more about firewalls, you can go to the Shield's Up section of Steve Gibson's website.

A good back-up system is essential to complete the day trading computer. A 56-K modem can serve as a back-up for the high-speed Internet connection, and a UPS (or, worst case, a surge suppressor) can be used to provide back-up power during blackouts and suppression of voltage surges. Back-up System: In a perfect world we wouldn't need a back-up system, but we don't live in a perfect world. The most important forms of back-up for trading are:

56-Kbps Modem in case your DSL or Cable Modem service is disrupted

Uninterruptible Power Supply (UPS) for power failures and surges

Even though I've had very few cases where the DSL stopped working during trading, it has happened. The same thing can happen to the Cable Modem service. When your high-speed service goes out, you can simply connect via your regular 56-Kbps modem over your phone line. Even though you cannot feed as much information through the 56-K service as you can with a high-speed connection, it still allows you to place trades and view some basic trading information. With RealTick, you can design a simpler configuration in the software for cases like these (the currency trading software works perfectly even with a 56-K modem).

A UPS allows your trading computer to run on back-up power when there is a blackout. Even though you cannot run on back-up power indefinitely (based on the limitations of the UPS), it does allow you to close any positions that you do not want to leave open as well as save anything that you were working on. A UPS will also provide protection from voltage surges that can damage your trading computer. In the United States we have the advantage that the power is pretty reliable, so if you don't want to buy a UPS right away when you start trading that's OK, but you should at least use a surge suppressor ($20 to $50) to protect your trading computer from sharp fluctuations in power which can easily damage or shorten the life of your trading equipment.




