Monday, April 23, 2007

Article for International Internet Issues

How to protect your Trademark from Cybersquatters
Memo #12/13-International Internet Issues



This memo will focus on the issue of cybersquatting on the Internet, and the challenge trademark owners face in attempting to stop a cybersquatter from misappropriating a trademark. What is a cybersquatter? In a simplified nutshell, a cybersquatter obtains an Internet domain name that uses language very similar to an existing trademark. Or the squatter looks for a new company that has failed to register the most likely domain names one would use to market its brand. In many cases, the domain names obtained are misspellings of, or slight modifications to, the trademark holder's website. Here is an example (hypothetical, not actual): if the main website for Coca-Cola is www.coke.com or www.coca-cola.com, a cybersquatter would attempt to register www.koke.com or www.koka-kola.com or other variations. The cybersquatter would then populate the website with ads and wait for consumers to enter the misspelled address and visit the site, driving up viewership numbers and earning more money from ad sales. Of note, there is also a concern for malicious use, such as posting a virus or malicious code on the squatted website, or directing consumers to a competing product (sending a user looking for Coca-Cola to a Pepsi website). There are also legitimate uses for a near-miss domain, such as a personal name (morgan&stanley.com instead of the site for the Morgan Stanley company, morganstanley.com) or selling a legitimate product. I will set that issue aside and simply focus on squatters looking to drive up ad revenues.
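The pattern of near-miss registrations described above is mechanical enough to sketch in code. Below is a rough Python illustration of how typo variants of a domain label might be generated; the function and its substitution rules are my own invention for this memo, not taken from any actual squatting tool:

```python
def typo_variants(name):
    """Generate simple near-miss spellings of a domain label.

    Illustrative only: real typosquatters use far larger rule sets
    (keyboard adjacency, homoglyphs, added hyphens, extra TLDs).
    """
    variants = set()
    # Sound-alike substitutions, e.g. "coke" -> "koke"
    for old, new in [("c", "k"), ("k", "c"), ("ph", "f")]:
        if old in name:
            variants.add(name.replace(old, new))
    # Dropped letters, e.g. "coke" -> "oke", "cke", "coe", "cok"
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:])
    # Doubled letters, e.g. "coke" -> "ccoke", "cooke"
    for i in range(len(name)):
        variants.add(name[:i] + name[i] + name[i:])
    variants.discard(name)  # the real name is not a typo
    return sorted(variants)

print(typo_variants("coke"))
```

Even this toy rule set produces the koke.com-style variants described above, which is why trademark holders often try to register the obvious misspellings defensively before a squatter does.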
Why is this an issue? Companies look to protect their trademarks and their markets by directing customers to their main websites. Trademark holders want to own misspelled or one-off domains so they can redirect the user to the main website. The problem is especially acute for international users, where vagaries in language can easily send a user to a misspelled site (though the same problem happens within the U.S. as well).
So why is cybersquatting an international issue? Because many domain name registrations occur outside of the United States, through procedures established by ICANN. The United States Patent and Trademark Office has 3,000,000 registered trademarks, but there are 33,000,000 registered domain names throughout the world, so trademark holders are likely to face a cybersquatting issue when protecting their marks. And if a cybersquatter is outside of the U.S., American law and courts are helpless to stop them. That is why ICANN, through dispute-resolution providers such as the World Intellectual Property Organization ("WIPO"), has established procedures and arbitration panels to help fight against cybersquatting. ICANN, as the root authority for the DNS, requires domain name registration services to follow the arbitration panels' decisions.
Thus, the international response to cybersquatting is one example of a quasi-judicial solution to an international Internet issue.

Monday, April 9, 2007

Articles Associated with Memo #11

Appleinsider.com response by Norway Ombudsman to Apple-EMI announcement.

Appleinsider.com original consumer complaints from Norway.
Memo #11-Apple and Norway: an antitrust issue?
Many of you may have followed the recent EMI-Apple agreement to begin selling music through the iTunes music store unencumbered by Digital Rights Management ("DRM") restrictions. My focus is on the response made in Norway to this new deal. Why Norway, you may ask? Well, in the last year, the Norwegian Consumer Council lodged a formal complaint against Apple with the Norway Consumer Ombudsman. The charges related to Apple's use of FairPlay DRM to protect music files downloaded from the Norway version of iTunes (Apple has long differentiated the iTunes Stores based on the country of origin of the purchaser--the reasons and means of doing so are beyond the scope of this entry). The charges stated that the FairPlay system violated Norway's consumer protection laws. n.1 By imposing FairPlay on the music bought through the iTunes Store, and technologically limiting playback of FairPlay-wrapped music files to the iTunes program and the iPod, Apple raises consumer concerns about anti-competitive tying of the two products. Other charges include the limited liability Apple would face in the country for security holes created by the FairPlay system, and that the End User License Agreements ("EULA") imposed by Apple violate Norwegian law because a consumer in Norway cannot be forced to accept the choice-of-law provisions contained in the EULA. n.2
This is an interesting series of arguments because of a single, fundamental issue. What happens if Apple simply pulls up stakes in Norway and refuses to deal with Norwegian consumers? Nothing that I have been able to find would preclude Apple from dropping any and all support for the Norway version of iTunes, in either a technological or legal sense. There would of course be ramifications from such a move: boycotts of other Apple products sold in Norway; outright rejection of the iPod in the country; support found in other European countries leading to a broader boycott of Apple; and, of course, the public relations backlash. However, the major issue remains: the Norwegian consumer would be harmed by losing full access to one of the few competitors in the legal music download market. This concern is reflected in the fact that the Ombudsman has set a September deadline for Apple to respond to the charges with ways of improving the store's fairness. It is legitimate to think that the Ombudsman is hoping for additional pressure to be placed on Apple (such as the European Union's antitrust authorities going after Apple on their own), which would prevent Apple from simply withdrawing from the market. For evidence, see the praise given by the Ombudsman to Apple and EMI for taking this first step towards removing DRM; the Ombudsman fails to mention concerns that the music will be sold as AAC files (which only a few players support as of now) or the higher prices being charged (which also buy higher bitrates).
Thus, it is clear that attempting to enforce antitrust/consumer protection laws stronger than those found in the home country of the company raises serious questions about unintended consequences. If a firm does not want to be subject to the antitrust laws of a country, and can economically afford to withdraw, then there is nothing to prevent this action, resulting in greater harm to the very consumers the laws were designed to protect.


1. AppleInsider.com, Norwegian official applauds Apple-EMI deal, asks others to follow, April 2, 2007, http://www.appleinsider.com/article.php?id=2625
2. AppleInsider.com, Norwegian consumer group opposes iTunes TOS, June 7, 2006, http://www.appleinsider.com/article.php?id=1792

Thursday, March 22, 2007

Memo #10—Google Docs Experience

For the best explanation of my experience and opinions as to the Google Docs program, I am currently typing this memo in Microsoft Word. I will then copy and paste the memo into Google Docs for publication to the Standards blog. So, in a nutshell, I feel that the limitations and compatibility issues outweigh the benefits of Google Docs.
I understand and appreciate the benefits of Google Docs. The first is that it is a thin client. By running the program on the Internet, no hard drive space on the computer is used and little RAM is occupied by the program. I run Mac OS X, and since I am currently using Word, I will report how much processor capacity is being used, how much RAM is occupied by the typing of this memo, and how much hard drive space Microsoft Word occupies. I performed a clean login, which prevents startup programs from running in Mac OS X. Activity Monitor tells me that Microsoft Word is currently using 3.40% of the capacity of my 2 GHz Intel Core Duo processor. Currently, Word is occupying 75.06 MB of RAM. Finally, Word as an application occupies 19.5 MB of hard drive real estate, but the entire Microsoft Office suite of applications occupies 415 MB of space. True, these numbers are relatively small, but when a computer is maxed out, all of that extra disk space, processor capacity and RAM is needed, and thus an online suite of office applications is attractive. The second advantage is the online storage of files, which automatically backs up all files. If your hard drive fails, your Office files are lost unless you personally back up your files. These are the two main advantages for me, but they are entirely outweighed by the limitations of the program.
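For what it's worth, the Activity Monitor figures above can also be pulled from the command line. Here is a rough Python sketch (the helper function is my own, not part of any Apple tool) that reads a process's resident memory and CPU share via the POSIX `ps` command, which works on Mac OS X and Linux alike:

```python
import os
import subprocess

def process_footprint(pid):
    """Return (resident_memory_kb, cpu_percent) for a process.

    Uses the POSIX `ps` command; rss is resident memory in KB.
    """
    out = subprocess.run(
        ["ps", "-o", "rss=,%cpu=", "-p", str(pid)],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    return int(out[0]), float(out[1])

# Measure this Python process itself as a demonstration.
rss_kb, cpu = process_footprint(os.getpid())
print(f"{rss_kb / 1024:.1f} MB resident, {cpu:.1f}% CPU")
```

Running a check like this before and after launching Word would show the footprint numbers quoted above; a browser-based suite avoids that footprint at the cost of the limitations discussed next.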
The first limitation for me is compatibility. Inexplicably, to me, Google Docs does not support Safari for Mac. While I am able to run the program in Firefox, Safari is my default browser and my personal choice of browser. Firefox for Mac OS X takes a long time to initialize, longer than Word, and thus I am less inclined to use Google Docs through the Firefox browser. The second limitation is the inability to create footnotes or to import footnotes into Google Docs. As a law student, I use footnotes all the time. Because this significant feature is lacking, I am less inclined to use Google Docs. Finally, I regularly use my computer to compose and edit documents in places where I lack Internet access, such as the bus, car and airplane. If I cannot access the documents, or save them, I am disinclined to continue using the program.
Thus, these three limitations for me far outweigh the benefits of using a thin, online word processor client such as Google Docs.

Monday, March 19, 2007

Memo #9-Second Life



My Second Life experience had been an utter failure, up until this week. Attending classes had been difficult, finding more than one person per session willing to talk had been difficult, and for the most part, I was surrounded by fellow "newbies," who knew even less about the system and protocols than I did. That changed over the weekend, when I was finally able to attend a class on the "island" of Twilight. The following is my experience and my observations.

Having found the class through the Secondlife.com/events page, I clicked on the appropriate link and was transported to the right location--only I did not know it at the time. The time for the class was listed as 11:00 AM. But in what time zone? (At this point I was still unaware that Second Life runs entirely on Pacific Standard Time, an ignorance that probably caused me to miss other classes.) I logged in at 9:00 (thinking the class was at 11:00 EST) and walked around. I ended up in another portion of the island, owned and run by a character named Thoth. I asked him about the class and he helpfully directed me back off his property, which in the end I think was the actual goal. I was directed to a sign listing the class and the time, and was "teleported" back to where I had started.

Arriving back at the location, there was still no one in the area. After a couple of minutes of wandering around, an "Eccentric Person" named Zenos literally rode in on a horse to help. Yes, a giant black horse. After speaking to him for a few minutes, I learned the actual time of the class and that he and a friend would be teaching it.

Arriving back at the appropriate site, and finally at the appropriate time, I met the instructor, "Brenda Goodliffe," who provided a free box of supplies (landmarks, a t-shirt, notes and so forth). The format of this class was a question and answer period for newbies. At the beginning of the class, I was the only person willing to ask questions. I did not focus on the basics of Second Life, as I had become proficient with the interface during my aimless wandering; instead I was looking to get answers to some of the questions that had been bugging me about the system.
My primary concern was the issue of privacy. I had accidentally wandered onto the part of the island owned by Thoth. As I stated earlier, his primary concern was finding out why I was on his "land" and what my intentions were. When I informed him that I was a newbie, he essentially escorted me back into the "town" and off his property. I found this fascinating. Also fueling my questions was the presence of blocked areas throughout the world, which essentially wrapped a portion of an island with police tape, denying access unless I was part of a group. The answers I received fascinated me. Brenda spoke of how the "residents" of these islands, those who had actually purchased land and built structures, considered these places their homes, and were thus protective of their privacy and wary of trespassers. An online community with complete freedom of movement and the ability to fly and teleport seems to be at odds with this goal. Brenda also spoke of how residents restrict access so that they can change "outfits" in privacy. Really? Changing outfits means going into your inventory and clicking on a file, and suddenly your avatar looks different. The clothes never come off or go on, so why would one need actual privacy?

The answer lies in the hyper-sexual nature of the Second Life community. I was informed of an island of "Goreans" (this memo is sounding more like a fantasy novel at this point), where the women are sexual slaves, complete with S&M outfits and leashes. Zenos, my friend with the horse, sent me a "snapshot" he took while on the "Gorean" island. Needless to say, the photo was quite graphic. In fact, one of the most profitable jobs on the island is to serve as an "escort," which is exactly what it sounds like.

My second set of observations is based on these types of jobs. At the beginning of Second Life, residents were "paid" in the online currency, Linden dollars, for building new structures or for teaching classes. Now, there are very few paid positions. From what I learned, Linden dollars are now almost entirely derived from real-world dollars, not from economic activities within the Second Life world. Brenda agreed that this was the most likely reason for the reduction in actual classes being offered, and she lamented the increasingly commercial nature of the world (whole islands are dedicated to commercial enterprises, including one advertising the L Word show on Showtime).

At this point, the class delved into more mundane matters, such as how to build and manipulate objects in the world. I had received the answers I wanted and left Second Life. I'm glad I learned what I did from this class. It really has jaded my opinion of the Second Life community.

Sunday, March 11, 2007

Memo #8-Podcasting

My experience with podcasting is different from the norm. I have yet to find a podcast that I have not enjoyed, and thus this memo will be directed towards the qualities that, in my opinion, an effective podcast should have. For the most part, the best podcasts in my opinion are those produced by professional entertainment companies. For example, my personal interest in podcasts relates to sports. The three podcasts that I listen to are the free versions of Mike and Mike in the Morning and The Big Show with Dan Patrick and Keith Olbermann, both of which are derived from the ESPN radio shows of the same name, as well as the podcast version of Pardon the Interruption, which is the audio track of the ESPN television show of the same name. I listen to these podcasts because, with my busy schedule, I have no time to listen to the free, over-the-air versions of the shows. The other reason is that the shows are presented largely commercial-free, an option unavailable when listening to the shows live.

The fact that the shows are commercial-free is the primary reason I choose to listen to these podcasts. As an early adopter of TiVo-DVR technology, I have come to appreciate any new way to consume content without commercial interruption. Podcasts are generally inexpensive in terms of production and distribution. In the case of the ESPN podcasts, which repackage content already distributed through radio and TV, the content has already been paid for through the traditional media model. This allows the podcasts to be distributed with little or no advertising. Other than The Big Show, which has a 10-second commercial embedded at the 8-minute mark, these podcasts have commercials only at the end. The reason for this is obvious: the content is already paid for, and the podcast is an alternative means of developing an audience. Presenting commercial-free content is a way to draw in this secondary audience.

The second reason for choosing these podcasts is that, other than the Pardon the Interruption podcast, these are half-hour excerpts of three- or four-hour radio programs. The editors pick and choose the segments from the day and re-cut them into shorter, manageable portions before uploading to the Internet. Length is the key; a half-hour is an appropriate length. I listen to the podcasts during the bus ride to and from Denver. Podcasts running around 15 minutes are too short: the content is over before the trip. But podcasts that last longer than a half-hour mean that I will often still be listening after leaving the bus.

The final reason for choosing these podcasts is their professional nature. I can trust ESPN to provide high-quality, entertaining content because the source is consistent and professional. Sports content depends on reporters and analysts who are connected to the teams, are able to interview players, and have a strong understanding of the sport. A professional source is the best way to deliver such content. But content is not the only aspect of a professional product. ESPN also ensures that the audio quality will be consistent, that the podcasts will be uploaded and available, and that they will carry metadata that allows easy searching. In choosing podcasts, I am looking for professional quality in both content and production, half-hour programs, and content delivered with little or no advertising.

Other podcasts that meet these criteria are NPR's Fresh Air and Chicago Public Radio's This American Life. Both deliver high-quality content and audio, without commercials, as they consist of repackaged content. The only reason I do not listen to them as regularly as the ESPN podcasts is their length: they run longer than a half-hour, which is outside my normal criteria.

Monday, March 5, 2007

Wiki Article

XCP Protection

http://en.wikipedia.org/wiki/XCP
Memo #7-Wikipedia is not a credible source (Depending on how you use it).
Don't get me wrong, I love Wikipedia. Wikipedia will provide quick-hit answers to questions in the legal, technical and pop-culture fields. The answers tend to be correct, and the sourcing at the end of each article can be the start of a fruitful research project. If that is how one plans to use Wikipedia, as the basis for a larger research project and the starting point for relevant sources, then yes, Wikipedia is a credible source. But if a person wishes to cite a Wikipedia article for support in a paper or project, then no, it is not a credible source. As with most questions I have approached in law school, the answer depends...

Why is it not a credible source for citation in a paper or a research project? The answer lies in the very nature of the project: it is open to everyone. In July of 2006, New Yorker writer Stacy Schiff examined the Wikipedia phenomenon, and her article forms the basis for this opinion. The first amazing quality of Wikipedia, which in turn is the first reason why it is not credible, is that there are five employees besides the founder, Jimmy Wales. Five employees; that is it. All other content and editorial control comes from the users. User creation and editing of content allows the site to run with just five employees, but it also invites unscrupulous actions. According to Schiff, U.S. senators have been caught massaging their own entries in order to sanitize voting records, refine their stances on issues, or distance themselves from an unpopular president. In fact, the entire House of Representatives has been banned from posting, at different points, for the same reasons. If there are major question marks regarding the Wikipedia entries for senators and U.S. representatives, then it is hard to implicitly trust the information found in other entries.

The second reason stemming from the open nature of the site is that changes are not automatically vetted by the editorial “staff”, made up of administrator-level users who can enforce the site’s standards. Schiff points out an article on the 2006 Israel-Hezbollah conflict that has been edited over 4,000 times and draws the interest of writers and editors. One could call that article a credible source because it is consistently being vetted. However, one of the attractions of Wikipedia is that there are articles on just about anything that strikes your interest. These obscure entries can be edited multiple times without ever being checked to ensure they meet the standards of the site. If an article is inaccurate, and remains so for an extended period of time, then it cannot be a credible source.

Finally, a recent development in the wake of the Schiff article has forced me to conclude that the site is not an appropriate source. In preparing the story, Schiff was contacted by a Wikipedia administrator, identified only by a screen name. In conversations, Schiff was led to believe that the administrator was a college professor with a Ph.D. in theology. In fact, the person was a 24-year-old with no connection to teaching and no advanced degrees. If an administrator is lying about his identity, then the credibility of the source must be questioned.

So, if you are looking to use Wikipedia as an initial source for research, following the included citations out into the larger Internet, then use Wikipedia. I just can’t see anyone actually citing it as a stand-alone, legitimate source.

Monday, February 26, 2007

Memo # 6: Foreign Investment in Telecommunications

The question asked was to address an issue with investment in telecommunications in a foreign country. For this memo, I wish to examine a related topic. In 2006, a paper was published for the Southern Association for Information Systems Conference, entitled Telecommunication Investment in Economically Developing Countries. The short paper addressed the annual budget expenditure of economically developing countries ("EDCs") on their telecommunications infrastructure. The basis of the study is Annual Telecommunications Investment ("ATI"), defined by the authors as "the annual expenditure associated with acquiring ownership of property and plant used for telecommunication services." Id. at 30. This expenditure was then compared to each nation's Gross Domestic Product to determine the percentage of GDP invested in telecommunications and whether this investment has spurred overall growth in GDP. The results are interesting and provide evidence that EDCs need greater foreign investment in their telecommunications industries. As many of these nations still have a nationalized, government-funded telecommunications industry, spending is often limited by government budgets and political will. There are a few major conclusions to draw from the research.
Many of the EDCs profiled by the study, nations in Africa, Eastern Europe, Central America and Asia, are spending no more than 2.5% of annual GDP on ATI. That is in line with many of the industrialized nations. The United States is amongst the lowest spenders, at 0.5% of the nation's GDP. However, the United States invested $34 billion in 2002 in the telecommunications industry. Industrialized nations have the resources to expend the billions needed to grow and maintain the telecommunications network. For a better comparison, look at the data from Gambia and the Czech Republic. In 2002, the Czech Republic, a relatively strong industrialized economy, spent 1.7% of its GDP on ATI. Gambia, a decidedly developing nation, invested 2% of its GDP on ATI. But here is the difference: the actual amount spent by the Czech Republic was $810 million, while Gambia's actual expenditure was $8 million. The near-equality in percentage of GDP, when viewed in terms of actual dollars, is staggering.
The ramifications are huge. The authors state that the investment required to raise a nation's telephone infrastructure (again, simply telephone infrastructure) from 1 phone per 300 inhabitants to 1 phone per 100 inhabitants is $8 billion. Staggering: bringing an EDC to a basic telephone infrastructure of 1 phone (AGAIN, JUST PHONES) per 100 inhabitants requires roughly ten years' worth of the annual investment of a reasonably developed nation like the Czech Republic, and roughly a thousand times Gambia's current annual investment. EDCs cannot make these telecommunications investments on their own, using the resources within the nation.
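The arithmetic behind this comparison is simple enough to lay out explicitly. A quick sketch using the dollar figures cited above (the variable names are mine):

```python
# 2002 ATI figures cited above, in US dollars.
czech_ati = 810e6   # Czech Republic: $810 million (1.7% of GDP)
gambia_ati = 8e6    # Gambia: $8 million (2% of GDP)
target = 8e9        # $8 billion to go from 1 phone per 300 to 1 per 100 people

# Near-identical GDP shares hide a roughly hundredfold gap in real spending.
print(f"Czech/Gambia spending ratio: {czech_ati / gambia_ati:.0f}x")

# Years of current annual investment the $8 billion target represents:
print(f"Czech Republic: {target / czech_ati:.1f} years of ATI")
print(f"Gambia: {target / gambia_ati:,.0f} years of ATI")
```

The division makes the point starkly: at current spending, Gambia would need a millennium of its entire annual ATI to reach even basic telephone coverage, which is why outside capital is the only realistic path.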
For the EDCs of the world to reach even a basic level of telecommunications infrastructure, either foreign investment is required to enable the growth, or else the nation must be willing to take on huge amounts of debt to grow the network. Two nations, Azerbaijan and Gabon, attempted the second route. In 2002, Gabon spent 229% of its GDP on ATI, while in 2000, Azerbaijan invested 436% of its GDP on ATI. But it is unclear whether these debt-fueled investments will actually grow the national economies. A better, safer route for a country would be to allow foreign investment in the telecommunications network, enabling the nation to provide necessary service without incurring the economic hardship of debt.

Article

Role of Telecommunications Investment in an Economically Developing Country (EDC). Does the increase in telecommunications investment lead to a general advance in the economic standing of the country? See below and my next post.

http://sais.aisnet.org/2006/Negash-SAIS2006-papera.pdf

Sunday, February 18, 2007

Memo #5-Public Service Announcement on Funding 9-1-1

FROM: THE NATIONAL EMERGENCY NUMBER ASSOCIATION: NEXT-GENERATION EMERGENCY 9-1-1 PARTNERSHIP

Change is good, but the process of change means tough choices about how to deal with its effects.

For years, the Emergency 9-1-1 service has been funded by subscriber fees assessed on all wireline and wireless phone bills and collected by telecommunications providers. The providers then remit these fees to local governments to ensure community-wide access to the Emergency 9-1-1 service. This blanket fee ensured that Emergency 9-1-1 was paid for by, and available to, everyone in the community. The Emergency 9-1-1 service has saved countless lives by connecting the community to the emergency services they need. But current technological and economic revolutions threaten to erode the funding for this vital public service. Traditional telephone service is being challenged by the cellular phone and the Internet phone. These challenges to the traditional system are resulting in lower prices and better services for those who choose them; a choice that does nothing but benefit society.

However, a difficult choice must be made. The move away from the traditional sources of funding threatens to erode the very service that has saved and protected society at large. The other issue is that many providers of Internet telephone service lack the ability to connect to the existing Emergency 9-1-1 system, the willingness to provide such a connection, and a method of helping to fund it. As a result, Emergency 9-1-1 is threatened.

However, there are tough choices that can be made to alleviate these concerns. First, write your congressman and senator and put pressure on them to actually appropriate the $25 million in Emergency 9-1-1 funding that was authorized in 2005, allowing these public funds to be released to the states and local governments. The release of this money will pay for the necessary upgrades to ensure that the existing Emergency 9-1-1 system will work in conjunction with the new telecommunications technologies.

This release is a short-term solution. For the long term, the local consumers of Emergency 9-1-1 service need direct, reliable sources of funding that apply to all members of the community. As such, please encourage your local city councils, county commissioners and public utility commissioners to adopt the small surcharge on access infrastructure providers advocated by the NENA. This surcharge, which is lower than the surcharge imposed on telephone providers in the past, will apply to wireless, wireline, and Internet telephone service by imposing the fee on the access providers. Businesses, homes and other entities all need an access infrastructure provider to interconnect, so this proposal ensures that everyone in the community contributes to Emergency 9-1-1 services. By spreading the cost across all connections, the NENA proposal will keep the individual cost low, lower than what has traditionally been paid. Finally, this fee will ensure that Emergency 9-1-1 services are paid for and responsive to local developments.
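The cost-spreading logic of the proposal is just division. Here is a toy sketch with hypothetical numbers (the budget and subscriber counts below are invented for illustration, not NENA's actual figures):

```python
def monthly_surcharge(annual_911_budget, connections):
    """Per-connection monthly fee needed to cover a 9-1-1 budget."""
    return annual_911_budget / connections / 12

budget = 1_200_000         # hypothetical annual 9-1-1 cost for one county
landlines_only = 50_000    # old base: wireline subscribers alone
all_connections = 150_000  # wireline + wireless + Internet telephone

# Broadening the base from landlines to every connection cuts the fee.
print(f"${monthly_surcharge(budget, landlines_only):.2f} per landline per month")
print(f"${monthly_surcharge(budget, all_connections):.2f} per connection per month")
```

Under these made-up figures, tripling the base of fee-payers cuts each subscriber's monthly fee to a third, which is the intuition behind applying the surcharge to every access connection rather than to traditional telephone lines alone.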

The NENA knows that imposing a fee on every connection is a tough pill to swallow, but it is a decision that is necessary to ensure Emergency 9-1-1 service will be available to all who need it. Your life, or the life of someone you care about, may depend on it.

Sunday, February 11, 2007

Memo #4: Effect of the ITU-T on Mozambique

Based on the article Mozambique Event Examines Standardization Issues from an African Perspective, ITU-T's Newslog, available at http://www.itu.int/ITU-T/newslog/Mozambique+Event+Examines+Standardization+Issues+From+An+African+Perspective.aspx (October 2, 2006), the ITU-T and other multinational telecommunications bodies are at this point simply at the stage of informing Mozambique's telecommunications regulators about the structure of the ITU-T and what it can provide. The ITU-T, the African Advanced Level Telecommunications Institute (AFRALTI), and the ITU's Centre of Excellence conducted a three-day workshop in late October 2006 for the country's telecommunications regulators, exposing them to the new standards available, in an attempt to encourage adoption of these standards to improve the country's telecommunications industry and to increase the country's interconnection with the larger world community. Of the standards discussed at the event, the two that received particular focus were NGN (Next Generation Networking) and VoIP (Voice over Internet Protocol).
As an initial matter, "The [ITU] Worldwide Centre of Excellence (CoE) Network consists of regional mechanisms aimed at strengthening the capacity within each region in order to develop high-level know-how and expertise in telecommunication policy, regulatory issues, corporate management and advanced telecommunication technology." See http://www.itu.int/ITU-D/hrd/coe/. AFRALTI is one of two African Centres of Excellence, focusing on human resources training and other network training for the developing English-speaking countries of Africa; the other Centre of Excellence focuses on the French-speaking nations. See http://www.afralti.org/coe.html.
The ITU-T defines NGN as follows: "A Next Generation Network (NGN) is a packet-based network able to provide services including Telecommunication Services and able to make use of multiple broadband, QoS [quality of service]-enabled transport technologies and in which service-related functions are independent from underlying transport-related technologies. It offers unrestricted access by users to different service providers. It supports generalized mobility which will allow consistent and ubiquitous provision of services to users." NGN Working Definition, ITU-T, available at http://www.itu.int/ITU-T/studygroups/com13/ngn2004/working_definition.html. The primary feature of the NGN is the fusing of older, separate and non-interconnected networks into one IP based network. Such a network would enable the expansion of VoIP services across this network, allowing developing countries to interconnect with the rest of the world.
Obviously, a concern of regulators in developing countries, and an issue that was raised at the Mozambique conference, is the security of the network. Interconnecting with the greater world on an international standard raises the possibility that the national network will become a target of denial-of-service attacks or other hacking tools developed for these standards. Other security concerns include the ability to use the network for governmental communications in cases of emergency, and the nation's individual standards on privacy. One feature of the NGN, again according to the ITU-T working definition, is that the NGN will be compliant with the regulatory concerns of each nation, suggesting that the NGN is highly flexible.
Based on the information above, the ITU-T is seeking to open dialogues with these telecommunications regulators to expose developing nations to the new standards and network developments, in an attempt to reconfigure these national networks so they will be interoperable with the larger world.

Monday, February 5, 2007

Memo #3: Video Game Wars

Ten years is a long time in the video game industry. Ten years covers what is considered two potential product cycles, with the potential release of two new iterations each of the Playstation, Wii and XBox. Or there could be a new player in the market, with one of these three systems vanishing from memory. Ten years ago, one of the major players in the industry was Sega. The company had introduced in the late '80s and '90s such gaming consoles as the Genesis, the portable Game Gear and the Sega Dreamcast, but today the company is a shadow of its former self. Even ten years ago, the prospect of a company like Microsoft becoming a dominant player in the video game console world was not even considered. The only constant in the video game industry is the inclusion of Nintendo amongst the most important and influential leaders in the development of consoles and content. So what does this mean for the next ten years in the video game industry?
One could be tempted to say that the console, with its proprietary content, production and discs, will vanish in a move toward placing all games on the Internet. I am personally doubtful of this development. Today, two of the three major console systems have become the showcase for a new format war: HD-DVD versus Blu-Ray. The XBox 360 now has an HD-DVD player that can be attached to the machine to add new functionality a year after the system launched. On the other hand, the Playstation 3 has a Blu-Ray player built right in. Unsurprisingly, Microsoft is a supporter of the HD-DVD format while Sony is pushing the Blu-Ray system. Both companies have placed their bets in this new format war, and the game consoles are the front-line soldiers in the fight. Looking deeper, the two consoles are also advertisements for the cutting edge of the computing industry. The Playstation 3 contains an IBM Cell processor that could later become the basis for high-end computing in other industries. The XBox 360 is designed around high-end components that the user could put in his own PC, powered by Windows of course. Both systems are also pushing the other new standard that is emerging: high-definition video. Both claim, with the XBox planning to add an HDMI connector, that they will push 1080p content to the beautiful plasma television hanging on the wall.
Then there is the third player, the upstart. The Nintendo Wii is based around older, less powerful technology, without a concern for pushing 1080p video. The system also bases its discs on the old DVD standard. This older technology provides a price advantage for the Wii, allowing it to come to market several hundred dollars below its competitors. But the Wii is pushing a new way to play: immersive gaming, which attempts to pull the player into the game and make him a true part of the experience, and intuitive gaming, which allows novices to pick up the controller and join in. Further, Nintendo is continuing to push the hand-held game console, which may end up being the ultimate goal.
So what does this mean ten years from now? This is the cop-out answer, but I just don't know. Now that the Wii has become a modest success, one can assume the larger players will bring immersive, intuitive gaming to their proprietary consoles while boasting more powerful systems and graphics. And one cannot say whether there will be a new format war to help drive new systems. One can at least assume that Nintendo will be around in some form, which I believe is the only constant. Ten years is a long time in the video game industry.

Sunday, January 28, 2007

A new Perspective on HD-DVD v. Blu-Ray

While this is not an attempt to be risque or to offend any reader, this article, originally written for Computerworld and republished by Macworld, speaks about the choice made by many of the larger video pornography producers to begin releasing films in the Blu-Ray format. As the article points out, one of the reasons VHS ultimately won over Betamax was the pornography industry's choice to support the VHS format.

Read at http://www.macworld.com/news/2006/05/02/pornhd/index.php?lsrc=mwrss

Memo #1

The Internet Corporation for Assigned Names and Numbers (ICANN) and the World Intellectual Property Organization (WIPO) Arbitration and Mediation Center are two examples of international bodies engaged in the process of setting standards worldwide. ICANN helps administer the assignment and registration of top-level domain names on the Internet. As part of that administration, ICANN adopted the Uniform Domain-Name Dispute-Resolution Policy. The goal of the Policy was to create a streamlined, international process to arbitrate domain name disputes. The Policy was handed down by ICANN and set the rules and standards by which disputes would be decided. Among the rules established were a list of appropriate disputes, the type of evidence that must be presented, and the standard by which the arbitration panels decide disputes. The evidence includes registration and use of a particular domain name in bad faith, such as evidence of "cyber-squatting," defined as sitting on a domain name that a trademark owner would be entitled to use. The infringer simply retains the domain name with the goal of obtaining a settlement for its transfer.
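The "cyber-squatting" described above often trades on near-miss spellings of a mark. As a purely illustrative sketch (the brand name and the particular misspelling rules are my own hypothetical examples, not anything drawn from the UDRP or ICANN), a few lines of Python can enumerate the kinds of typo-variant domains a squatter might register:

```python
# Hypothetical illustration: enumerate simple misspelling-style variants
# of a domain label, of the kind a cybersquatter might register.
# The rules below (omission, transposition, doubling) are example
# heuristics chosen for this sketch, not an authoritative list.

def typo_variants(name, tld="com"):
    """Return a set of near-miss domain names derived from `name`."""
    variants = set()
    # Character omission: "coke" -> "oke", "cke", "coe", "cok"
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:])
    # Adjacent-character transposition: "coke" -> "ocke", "ckoe", "coek"
    for i in range(len(name) - 1):
        variants.add(name[:i] + name[i + 1] + name[i] + name[i + 2:])
    # Character doubling: "coke" -> "ccoke", "cooke", "cokke", "cokee"
    for i in range(len(name)):
        variants.add(name[:i] + name[i] + name[i:])
    variants.discard(name)  # drop the correct spelling itself
    return {f"{v}.{tld}" for v in variants}

print(sorted(typo_variants("coke")))
```

A trademark holder could run the same kind of enumeration defensively, registering the variants before a squatter does.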
As stated, ICANN established the standards and rules for domain name disputes. The disputes are then decided through various arbitration panels. This memo focuses on the WIPO Arbitration and Mediation Center; however, the National Arbitration Forum, located in the United States, is another body that handles disputes arising under the Uniform Domain-Name Dispute-Resolution Policy. Each of these bodies must be properly accredited by ICANN in order to handle domain name disputes, and each must provide a conduit for review by courts with appropriate jurisdiction. In the United States, this conduit was codified by the Anticybersquatting Consumer Protection Act, which provided federal court jurisdiction for just such disputes. Between 1998 and 2000, the WIPO handled five thousand domain name disputes. However, with ICANN's addition of several new top-level domains, such as .biz and .info, the WIPO has now handled twenty-five thousand separate disputes.
The arbitration proceedings are surprisingly efficient, taking only forty to fifty days to reach a decision. The process is also flexible, allowing a dispute to be decided by a single panelist for a $1,500 fee or by three panelists for a $4,000 fee. Further, disputes do not need to be filed by a lawyer. One limitation is the type of relief available: the panel can only transfer the domain name to the winning party (or let the respondent retain it if the respondent prevails), or cancel the domain name registration. There is no monetary relief available, and the panel relies on the parties to implement its decisions.
While not a perfect solution, the WIPO Arbitration and Mediation Center, in conjunction with the ICANN policies, is attempting to create a worldwide standard by which Domain Name disputes are adjudicated.

Monday, January 22, 2007

Personal Background

As an introduction to everyone in the Standards and Standardization class, my name is Kevin Bell, and I am a third-year law student at the University of Colorado School of Law.

I am originally from Colorado Springs, Colorado, and I arrived in Boulder by way of San Antonio, Texas (Trinity University for my undergrad studies). My family primarily hails from South Dakota, but I married a Texan. We welcomed our first child, a daughter, this past year.

My undergrad degree was in History/Political Science, with a minor in Economics.

I have previously worked for the United States Attorney's Office for the District of Colorado, Thomson West (the provider of Westlaw services) and currently work for Catalyst Repository Systems and the intellectual property firm Townsend and Townsend and Crew.

My interest in Telecommunications stems from my desire to work in the high technology intellectual property field, particularly trademark and copyright issues. The world of standards, especially the development of proprietary systems protected by patents, trademarks and copyrights, overlaps my chosen field of study.