Popping no-coverage bubbles in citywide WiFi networks

A Rice University graduate student working with advisors at his institution and Hewlett-Packard developed and tested a technique that could dramatically improve how outdoor, metropolitan-scale WiFi networks are planned and deployed, while also reducing the cost and time involved in getting a network's footprint right.

PhD candidate Joshua Robinson developed techniques for predicting where dead zones will occur in WiFi networks. His work allows the use of simple two-dimensional terrain or zoning maps with relatively little detail as a starting point, and, for a deployed network, can increase accuracy with only a few measurements per square kilometer, producing surprisingly spot-on data about where holes exist.

Robinson also found that he could predict with definable accuracy how varying densities of WiFi nodes—the average number of radios hung from utility poles or other locations in a given area—would affect the prevalence of dead zones.

The work, described in "Assessment of Urban-Scale Wireless Networks with a Small Number of Measurements" and presented at the MobiCom '08 conference run by the Association for Computing Machinery (ACM), relies on analyzing the terrain of an area and then dividing coverage into radiating sectors that have relatively similar characteristics. In practice, the number of these sectors can be increased to improve accuracy or reduced to simplify the model.
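As a rough illustration of the sector idea (not Robinson's actual model; the path-loss exponents and constants below are invented for the example), a point's predicted signal can depend on which angular sector around the access point it falls in:

```python
import math

def sector_index(ap, point, num_sectors=8):
    """Index of the angular sector (around the AP) containing the point."""
    dx, dy = point[0] - ap[0], point[1] - ap[1]
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(angle / (2 * math.pi / num_sectors))

def predict_rssi(ap, point, sector_loss_exp, tx_power_dbm=20.0):
    """Log-distance path loss with a per-sector exponent.

    sector_loss_exp holds one path-loss exponent per sector, e.g. ~2.0
    for open terrain and ~3.5 for heavy foliage (illustrative values).
    """
    d = max(math.dist(ap, point), 1.0)  # distance in meters
    n = sector_loss_exp[sector_index(ap, point, len(sector_loss_exp))]
    return tx_power_dbm - 10.0 * n * math.log10(d) - 40.0  # ~40 dB loss at 1 m

# East of the AP lies open terrain; north lies heavy foliage.
exps = [2.0, 2.0, 3.5, 3.5, 2.0, 2.0, 2.0, 2.0]
print(predict_rssi((0, 0), (100, 0), exps))  # -60.0 dBm: likely covered
print(predict_rssi((0, 0), (0, 100), exps))  # -90.0 dBm: a probable dead zone
```

The appeal of this kind of scheme is that the per-sector exponents can be read off a crude 2D zoning map rather than a detailed 3D building model.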

Dead bubbles in a sea of WiFi froth

Dead zones are the curse of citywide WiFi, as evidenced by the complaints from users of many of the earliest and largest of those networks. The paper by Robinson and his advisors, released earlier this week, documents dead zones as small as 10 meters in diameter with access available just beyond the perimeter—almost like bubbles of no access within a froth of WiFi.

His results defy some conventional wisdom, which says that more nodes mean better coverage. In 2005, metro-scale WiFi network equipment makers often said 20 to 25 nodes per square mile were needed; by 2007, that number had risen in practice to 40 or even 70, to ensure both seamless outdoor coverage and indoor coverage using a wireless bridge with a signal booster.

Instead, in examining Google's free Mountain View WiFi network, Robinson found that above a certain low node density, adding nodes has only a tiny, and thus expensive, effect in filling in holes. He estimates that quadrupling the current density of roughly 17 nodes per square kilometer would only shrink coverage holes from 25 percent of the network's area to 10 percent.
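For comparison, an idealized Boolean coverage model with uniformly random node placement (an assumption real deployments clearly violate) predicts a far steeper payoff from densification; that the measured holes shrink only from 25 to 10 percent supports Robinson's point that the remaining holes are driven by location rather than density. A back-of-the-envelope sketch:

```python
import math

def uncovered_fraction(density_per_km2, radius_km):
    """Poisson/Boolean model: P(no node within range) = exp(-density * pi * r^2)."""
    return math.exp(-density_per_km2 * math.pi * radius_km ** 2)

# Calibrate an effective coverage radius so that 17 nodes/km^2 leaves
# 25 percent of the area uncovered, matching the Mountain View figure...
r = math.sqrt(math.log(1 / 0.25) / (17 * math.pi))

# ...then ask what the idealized model predicts at quadruple the density.
print(round(uncovered_fraction(4 * 17, r), 3))  # 0.004, i.e. ~0.4 percent
```

The idealized prediction of roughly half a percent uncovered, versus the ten percent actually observed, is the gap between random holes and holes caused by bad locations.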

These results might explain the failure of many municipally sponsored, privately built networks to please residents. It isn't that the networks were underbuilt; it may simply be impossible to provide the necessary coverage with WiFi in the environments where they're deployed without spending far too much.

It's not all about the nodes

Robinson told Ars that node density turns out to not be the defining characteristic of whether and how frequently dead zones occur. It's not "just because they didn't put enough nodes out, or they didn't pick the right places," he said. Some dead areas exist "not because there aren't enough nodes around, but because they're just in a bad location for getting a good signal."

Most large-scale WiFi networks are planned using a combination of wardriving-like measurement, terrain and building data, and simulation. Rough ideas are tested with small deployments, and the model corrected. Some planning software allows building material types and vegetation to be noted.

But Robinson thinks quite a lot of this could be done away with, especially for projects like the Technology For All (TFA) network he has worked on, developed by Rice to bring Internet service to an underserved Houston area. The nonprofit doesn't have the resources to spend tens of thousands of dollars on measurement software or consultant contracts. "I wanted to show that you don't actually need all the complication, you don't need to know what a building is made of, you don't need to know what types of trees there are," he said.

He drove the streets of Mountain View, gathering 35,000 GPS-tied samples, and collected another 29,000 from the TFA network. The two networks have somewhat different characteristics: Rice's WiFi nodes are placed high to avoid obstruction by buildings, but signals pass through substantially more tall trees than in Mountain View. (Robinson and TFA have made these measurements available for download.)

Robinson said he took all these measurements to be able to see how well his model performed, and whether taking many measurements would provide dramatically better results than plugging a 2D map into his model.

In his paper, he notes that more than 10,000 measurements per square kilometer appear to be required to exceed the roughly 80 percent accuracy that can be achieved with his model. Going into the field and taking a few dozen samples per square kilometer and using them to correct his model's data can push accuracy closer to 90 percent.
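As a rough illustration of that correction step (not the paper's actual fitting procedure), a handful of drive-test samples can be used to estimate and subtract the model's systematic bias:

```python
def correct_predictions(model_dbm, samples_dbm):
    """Shift model predictions by the median error seen at measured spots.

    model_dbm:   dict mapping location -> predicted signal (dBm)
    samples_dbm: dict mapping location -> measured signal (dBm), sparse
    """
    errors = sorted(model_dbm[loc] - rssi for loc, rssi in samples_dbm.items())
    bias = errors[len(errors) // 2]  # median model error
    return {loc: p - bias for loc, p in model_dbm.items()}

model = {"a": -60.0, "b": -72.0, "c": -81.0, "d": -90.0}
measured = {"a": -65.0, "c": -86.0}  # a few drive-test samples
print(correct_predictions(model, measured)["b"])  # -77.0: model was 5 dB optimistic
```

The point of the exercise is that even a crude correction like this needs only a few samples per square kilometer, not the tens of thousands a pure-measurement map requires.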

For now, his approach and tools are more evaluative: estimating and correcting his model to figure out where dead zones are. Robinson said his next paper, already submitted, applies his model to pure planning.


StumbleUpon makes stumbling easier for new users

StumbleUpon, the company that long ago figured out how we all really use the web, announced a handful of new features today designed to offer a test drive for new users, a richer experience for registered users, and better integration for web site owners. The company has had a rocky year, but these new features may be just what StumbleUpon needs to, well, keep from stumbling.

For those who have never used it, StumbleUpon allows users to sign up, specify preferences for a wide range of topics like politics, photography, history, and web development, and then literally "stumble" through sites that other, similar users mark as interesting. Users can pick a specific category or type of media to browse (such as photos and videos), or stick with Stumbling through friends' sites to play it safe. If you're thinking "like Digg, but more fun and no rabid voting or foaming comments," you're not far off.

Bring us your tired, your hungry, your Stumbling

Until today, however, users had to register with StumbleUpon, create a profile, and install an add-on for IE or Firefox just to start Stumbling. If you're a social media pro armed with form-filling tools, this probably isn't an issue, but there are plenty of users out there not willing to sign up for yet another service that, at first glance, may sound like Google on steroids. Toss in the fact that you have to install a new piece of software, and the barrier to entry is raised even further.

Now, new visitors to StumbleUpon.com will not only see a redesigned homepage that highlights popular content and ratings from the community, but they can click a "Start Stumbling" button to launch a JavaScript toolbar at the top of the browser window. No software install is necessary, and this should work in any browser, not just IE and Firefox. This toolbar is a bit less functional than StumbleUpon's browser extension, though, as users can't filter for specific kinds of content or, of course, use any of the service's community advantages. But rating sites and Stumbling for more works perfectly well, offering a slice of StumbleUpon's "best of the web" approach. The rub lies within the pseudo-toolbar's "Save" button: clicking it prompts users for registration to actually save their Stumbling and begin participating in the greater community.

This browser-agnostic tool doesn't work for registered users, though: in fact, if you're logged into the site, you'll never see it. In the coming months, StumbleUpon plans to introduce a similar tool for registered users so they can Stumble from any computer without installing the browser extension. Along with this tool, a redesign for the rest of the site will focus on user profiles and site navigation, as well as the rating and commenting systems.

Stumble right here

The other half of today's announcement is a Partner Program for web site owners to harness StumbleUpon's discovery services and the content its users create. Launched today with two partners—HuffingtonPost and HowStuffWorks—and more on the way, a new "Stumble!" badge that "premier publishers" can add to their site allows visitors to focus StumbleUpon's "show me something else interesting" approach on the current web site. Once invoked, StumbleUpon's aforementioned JavaScript toolbar appears for both registered and unregistered users, with a basic set of rating tools and a "Stumble!" button to keep the good times stumblin' at a specific URL.

After playing with this new tool at the two partner sites that went live with it today, the Stumble badge left us with mixed feelings. On the one hand, it does a good job of bringing the StumbleUpon experience to a specific site, but it also adds more clutter and yet another navigational system that, for all intents and purposes, is mostly an "I'm feeling lucky" gimmick. In an age where site operators are rolling up their sleeves and building plenty of their own "check out what else we're doing" navigation tools, this Stumble Partner Program could overwhelm users who need a paddle—not a blindfold—when sailing the Internet's seven seas.

That said, StumbleUpon does have a very healthy community and moves its fair share of traffic, so inviting that external community to have a digital picnic at one's site may not be a bad thing for exposure and pageviews. StumbleUpon's registered user base has steadily increased to 6 million strong, despite eBay putting the service up for sale less than two years after acquiring it. By opening its doors, integrating more deeply with external sites, and adopting a "try before you register" philosophy, StumbleUpon may very well boost its growth and, subsequently, drive even more traffic.


Researchers disclose deadly cross-platform TCP/IP flaws

DoS attacks have been around ever since the first caveman hacker decided to attack the first caveman network engineer's TCP/IP network. Much like sharks, DoS attacks have survived the passage of time by being very good at what they do, and while they've spawned offspring (distributed denial-of-service attacks, or DDoS), the original version remains alive and well in the deep waters of the Internet. A team of researchers—Robert E. Lee and Jack C. Louis—now claim to have found new vulnerabilities within the TCP/IP stack that can be exploited to allow for devastating DoS attacks, from simply crashing the device in question to snarling it so thoroughly it must be rebooted before it can function normally, even after an attack has been completed.

The problem is, Lee and Louis aren't willing to say much more than that. There's an extensive interview with the pair over at a site in the Netherlands with a name I could theoretically type, but you wouldn't remember, so I'll just hand over the URL (via Slashdot). The English section of the podcast starts around the five-minute mark, but some of you may enjoy the Dutch bits in front—I know I did. Throughout the course of the discussion, the two men detail how they stumbled on these TCP/IP vulnerabilities by accident some years ago, and what they've done since to document and explore the problem.

It's only fair, at this point, to note that Lee and Louis both come across as sober, professional individuals who are extremely knowledgeable on the topics they discuss. Despite how the press may spin their statements, neither man recommends a Chicken Little type of reaction, and both speak out against such hysterical hijinks. The two have developed their scanning software (Unicornscan) and a mysterious other bit of software known as "Sockstress" since they first found hints of the vulnerability back in 2005. All we know about Sockstress, at least at the moment, is that it's designed to do "evil things" while negotiating a three-way handshake.

Whatever "evil things" Sockstress does, it's apparently quite good at them. According to the duo, they've found significant vulnerabilities in every security system they've tested, and have yet to see a TCP/IP stack that isn't vulnerable to their attack methods. Jack Louis has documented five valid attack methods, and believes 30 or more may exist, depending on the specifics of the stack implementation. Currently, the two know of no TCP/IP implementation that's invulnerable, adopting IPv6 doesn't help (and can actually make the problem worse), and there's no anti-intrusion software that can help.

What we do know is that the problem here is caused by trust. Once the three-way handshake has been completed, Sockstress is apparently free to do whatever it does with no fear of reprisal. There aren't many details available past this point, but the two do draw certain comparisons between the DNS issues we saw this past summer and this present bug. Vendor response, thus far, has been quite different in this case—according to Louis and Lee, they've had very little luck in convincing vendors that they've actually hit on a problem, despite being able to demonstrate attacks that are more than simply proof-of-concept assaults.
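The trust problem can be illustrated with a toy connection table: once the three-way handshake completes, the stack commits per-connection resources that a peer can then tie up. This is purely conceptual and is not an implementation of Sockstress, whose details remain undisclosed:

```python
class ToyStack:
    """Caricature of a TCP endpoint's connection bookkeeping."""

    def __init__(self, max_connections=4):
        self.max_connections = max_connections
        self.half_open = set()   # got SYN, sent SYN-ACK (cheap to track)
        self.established = {}    # buffers, timers: real resources

    def on_syn(self, peer):
        # SYN cookies can make this step nearly free for the server.
        self.half_open.add(peer)

    def on_ack(self, peer):
        if peer in self.half_open:  # handshake complete: peer is now "trusted"
            self.half_open.remove(peer)
            if len(self.established) >= self.max_connections:
                raise MemoryError("connection table exhausted")
            self.established[peer] = {"recv_buffer": bytearray(4096)}

stack = ToyStack()
for src_port in range(4):  # one well-behaved-looking peer, four handshakes
    stack.on_syn(("198.51.100.7", src_port))
    stack.on_ack(("198.51.100.7", src_port))
print(len(stack.established))  # 4: the table is full; a fifth would fail
```

Defenses like SYN cookies address the half-open stage, which is why an attack that plays the handshake by the rules and misbehaves afterward is so awkward to filter.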

Unless the two security researchers have gone completely wonko, they've hit on something, and the fact that they're arguing for a calm, rational, and coordinated approach to the issue helps deflate any accusations that the men are simply after headlines. This particular boogeyman seems made of something a bit more solid than nightmares and an open closet door, but precisely what is underneath the sheet remains unknown and will stay that way through the immediate future.


Apple finally drops NDA, iPhone developers rejoice

Apple has announced via its Apple Developer Connection website that it has dropped the NDA that has left iPhone developers frustrated since the release of iPhone OS 2.0 this past July. In a note addressed "To Our Developers," Apple finally admitted that the NDA had "created too much of a burden on developers, authors and others interested in helping further the iPhone’s success."

Like most companies, Apple had attached a nondisclosure agreement to the SDK for developers who wanted early access before iPhone OS 2.0 was released. Nearly everyone involved in iPhone development expected the NDA to be lifted once 2.0 was officially released. Apple unexpectedly kept the NDA in place, however, and offered developers neither an explanation nor a timetable for when it might be lifted. This caused serious problems for authors with books ready to be published for new developers, as well as for programmers wanting to post iPhone programming tips and techniques to blogs or forums. The prohibition on blog and forum posts also made it difficult for developers to get help when they ran into trouble.

There was a lot of speculation about Apple's reasoning for keeping the NDA in place, including slow-moving bureaucracy and sheer ignorance, though protecting IP seemed most reasonable. Apple said in Wednesday's note that protecting IP was its reason for keeping the NDA in place. "We put it in place as one more way to help protect the iPhone from being ripped off by others," reads the note.

From what we're seeing on Twitter, the collective iPhone developer community couldn't be more delighted. "Ok. I'm back in on iPhone dev," said Second Gear's Justin Williams. Jonathan Wight, developer of the TouchCode frameworks, said, "I now have a whole bunch of code to put online." Addison-Wesley Senior Acquisitions Editor Chuck Toporek, on his way to inform production to start printing The iPhone Developer's Cookbook, stopped to say, "Thank you, thank you, thank you!" Even one of the NDA's most fervent detractors, Twitterrific developer Craig Hockenberry, joked, "The downside, of course, is that I need to find something else to be bitter about. Anyone want to get on my lawn?"

"I think this will really open the floodgates for iPhone development. An open exchange between developers is crucial on any platform, and with the shackles off, iPhone will flourish," says Ben Gottlieb, developer of Crosswords for iPhone. "Kudos for Apple to listening to the developer community and adjusting their position. Any increase in transparency is a welcome change, and hopefully this is a sign of more good things to come," he tells Ars.

Though Apple has said nothing about the recent issues regarding App Store rejections, this is certainly a major step toward regaining developers' trust. Of course, Apple will still keep unreleased versions of the iPhone OS and beta SDKs under NDA. But as with Mac OS X, iPhone developers are free to publicly discuss any APIs or other features in released versions. This will certainly go a long way towards maintaining a large and vibrant developer community.

The NDA's demise also means that we can finally run our in-depth probe into the iPhone SDK. Look for it this evening.


Tracing the origin of HIV-1

Viral archeologists have been trying to track down and understand the spread of HIV-1, the strain responsible for most HIV infections worldwide. In particular, researchers are interested in the M (major) group, which is responsible for over 90 percent of those infections. Although scientists first identified HIV as the cause of AIDS in 1984, the virus had been spreading among humans long before that. Since HIV evolves up to a million times faster than we do, it is advantageous to examine and compare HIV-1 samples from as early as possible; finding these samples is the limiting factor. Only one early-period HIV-1 sample, called ZR59, was known until now, but today's Nature unveils another "ancient" sample, giving us a more complete understanding of HIV-1's prehistory.

In a collaboration, eight research groups isolated and studied partial HIV-1 genome fragments from a lymph node obtained in 1960 in what is now Kinshasa, Democratic Republic of the Congo. This "ancient" virus, dubbed DRC60, shares about 88 percent sequence similarity with ZR59. Of the 11 classified subtypes of group M HIV-1 strains (A-K), DRC60 is most closely related to the ancestor of subtype A, while ZR59 is closest to the ancestor of subtype D. This difference shows that the diversification of HIV-1 was already underway half a century ago.

As both samples come from Kinshasa, a comparison of their sequences can be used to date their common ancestor. By performing statistical analyses with models that account for the rates of evolution and various modes of HIV-1 pandemic spread, the researchers estimated that the viral ancestor existed between 1902 and 1921.
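A back-of-the-envelope version of that dating logic (with an invented substitution rate; the paper's statistical models are far more sophisticated) already lands in roughly the right decade:

```python
def years_to_common_ancestor(fraction_diff, subs_per_site_per_year):
    # Differences accumulate along both lineages, hence the factor of 2.
    return fraction_diff / (2 * subs_per_site_per_year)

rate = 1.7e-3  # assumed substitutions/site/year; illustrative, not from the paper
diff = 0.12    # DRC60 and ZR59 are ~88 percent similar
print(round(years_to_common_ancestor(diff, rate)))  # ~35 years before 1960
```

Counting back roughly 35 years from the 1960 sample puts a simple strict-clock estimate in the mid-1920s, close to the 1902-1921 window the full analysis produced.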

Even though only two "ancient" sequences from Kinshasa are known, scientists had already inferred that the ancestors of all group M viruses originated there, as it is the location with the greatest diversity of group M subtypes in modern times. If that is the case, the simian immunodeficiency virus (SIV) that plagued chimpanzees jumped to humans at Kinshasa in the early 1900s, giving rise to the group M viruses.

This possibility raised a couple of additional questions. First, the chimpanzees infected with the SIV strains most similar to group M live about 435 miles from Kinshasa, in the southeast corner of Cameroon. So why did the virus take hold in Kinshasa? Second, what made the early 1900s particularly favorable for the start of an epidemic?

The researchers point to the rise of cities as the reason. Kinshasa was known as Léopoldville at the time, and its location near the mouth of the Congo River made it a key transportation spot. When populations grew in the early 1900s, Léopoldville became the largest city in the area. Infected chimpanzees and/or their handlers could have easily arrived there, and the sufficient population density allowed the virus to take hold.

The isolation of DRC60 allowed researchers to make significant progress in their efforts to trace the origin of HIV-1. Samples preserved in other hospitals in the region hold the promise of further discoveries of other early HIV strains, adding more details to the history.

Nature, 2008. DOIs: 10.1038/nature07390 and 10.1038/455605a

Blizzard awarded $6 million in damages from WoW bot maker

The case Blizzard brought against bot-maker MDY Industries has been going on since 2006, and while a judge ruled in July that MMOGlider infringed on Blizzard's copyrights, the question of whether the bot violates the DMCA is still open. That has not stopped the judge from awarding $6 million in damages in the case.

It's unknown how much money MDY Industries has made from its product MMOGlider, which allows users to automate the boring parts of World of Warcraft and essentially grind forever with no user involvement, but the $25 program had sold around 100,000 copies as of last year. In other words, the product was big business. Unfortunately, it also violated the game's terms of service.

"Blizzard owns a valid copyright in the game client software, Blizzard has granted a limited license for WoW players to use the software," Judge Campbell wrote in the July judgment. "Use of the software with Glider falls outside the scope of the license established in section 4 of the TOU, use of Glider includes copying to RAM within the meaning of section 106 of the Copyright Act, users of WoW and Glider are not entitled to a section 117 defense, and Glider users therefore infringe Blizzard’s copyright."

This is troubling and creates a chilling precedent. It basically says that the copy of the software created in your system's RAM is okay under the license, and when you break the EULA by using software like MMOGlider the license is revoked, and you are suddenly left playing an unlicensed copy of the game. This turns what is a normal terms-of-service issue into a copyright issue, and could potentially affect anyone running an instance of this software. Blizzard now has the option of going after anyone it believes is running a bot for copyright infringement. While at first glance unlikely, we've recently learned that Activision Blizzard has become very aggressive in its copyright litigation.

It could have been far worse for MDY, as the judge denied Blizzard's request for treble damages. Blizzard could appeal that decision, and the two parties will be back in court in January 2009 to finalize the settlement and resolve the question of DMCA violations. There are many issues at stake in this case, including how far copyright law can be extended to cover third-party game modifications, so this is certainly something we'll be watching closely.


Sonic Chronicles: The Dark Brotherhood. No. Just… no

To say that Sonic Chronicles: The Dark Brotherhood has some conceptual problems is putting it lightly. Sonic has always been at his best when he runs very quickly in platforming levels, and everything that moves away from that formula ends up taking away from the experience. Fishing levels, silly animal friends, adventure stages… just let the poor guy run fast. That's what he does. Giving BioWare free rein over a series that only works well with arcade-y action and asking the studio to turn it into an RPG may sound like a dream-team pairing, but it's nothing but a wasted opportunity.

If BioWare wanted to make a DS RPG, it should have started with characters that fit an RPG instead of trying to write dialog trees about having to find those Chaos Emeralds again. Because, of course, they have gone missing. Keep in mind that you never control any of the characters directly; you tap your stylus on the screen to move, and the downward-facing camera isn't doing anyone any favors. At least you can finally do something with all those rings: they act as the game's currency.

The game seems to go out of its way to ignore what makes both Sonic and BioWare great. BioWare has to deal with remarkably shallow characters and a story that belongs in an action game. Sonic has to deal with the fact that he's in an RPG. Yes, I love bacon, and yes, I love Nerf weaponry. The problem is I have enough common sense to realize stapling bacon to a Nerf gun doesn't make either of those two things better. Sonic Chronicles is the bacon-studded Nerf gun. Singularly, these things would be awesome. Mixed together? Ruined.

The game is easy, the characters' special powers are a cheap way to control when you can access different areas in the linear progression of the game, and combat is done by making motions on the screen or playing Elite Beat Agents-style tapping games with circles. A rhythm section shoehorned into a game stunk up Secret Agent Clank, and the corpse didn't get any sweeter in Sonic Chronicles.

This isn't a review, because honestly I only put a few hours into it, and I know people may scream that the game might have picked up at hour five or eight or whatever. So what? It's not worth fighting through the first few hours of mediocrity in the hopes it will pick up. I'd rather play an RPG with better characters and a better story. I'd rather play Sonic in a 2D platformer. I'd rather play a game where they didn't start with a fundamentally flawed concept, only to be hamstrung by it every step of the way.

Sega has a history of continually ignoring what makes Sonic great and giving us many, many games that feature Sonic doing crap he's not designed for. I don't know how many times gamers need to yell "NO!" while hitting the company with a rolled-up newspaper, but clearly it hasn't yet been enough. Sonic Unleashed, for instance, has were-hog levels. Let that sink in for a bit.


Cox enforces acceptable use policy, lies to its customers

Over the last few years, content owners have tried a variety of approaches to combat the sharing of copyrighted files, but of late, their attention has focused on a basic solution: kick the pirates off the Internet entirely. A so-called "three strikes" policy would see ISPs provide users who had been caught sharing copyrighted material with two warnings, after which they would be disconnected. Even though most of the three-strikes action has occurred in Europe, one US ISP has apparently implemented the policy, and justified its action with a spurious argument: the DMCA made them do it.

Internationally, content owners have been excited by the prospect of three-strikes regulation. The idea is that copyright holders could notify ISPs of people sharing unlicensed content. The ISPs would, in turn, warn the user about the legal dubiousness of this activity and provide hints on how to secure their networks and eliminate P2P software from their machines. If two warnings aren't sufficient, the ISP would simply disconnect the user on the third offense.

So far, however, the closest the policy has come to implementation is in the UK, where some ISPs have voluntarily agreed to send out warning letters, but haven't agreed to actually pull the plug on anyone. In fact, the European Parliament has just taken a major step to block that from ever happening. An amendment to a telecom bill would require that any disconnection be reviewed by the courts, which would probably make the process as painful, if not more so, than filing a lawsuit and seeking damages.

In the US, the RIAA has gone for the lawsuit/damages approach, and there has been little talk of agitating for a three-strikes law. Accusations are flying, however, that this may be a case of all action, no talk. TorrentFreak has posted the tale of one of its users, a Cox Communications subscriber, who has apparently been disconnected after what he claimed was his third strike. The report came complete with a screenshot of the warning page the user was referred to.

We were unable to find the text of this page by searching Cox's site. Assuming it's accurate, however, the most striking aspect of the page is that Cox claims it is required to take this action by the DMCA, a claim that is simply false.

There doesn't seem to be any reason to lie here; sharing copyrighted material is against Cox's acceptable use policy, so the company appears to have every right to terminate service. A spokesman for Cox told Ars that the screenshot simply reflects part of the process by which the company responds to a DMCA takedown notice. The company considers it essential to alert its customers when they are the target of these, and attempts to do so by e-mail. Only when that fails do users wind up having their browser redirected to the warning page.

Cox estimates that it has received hundreds of thousands of DMCA complaints, but has terminated accounts in "less than one-tenth of one percent" of these cases. As he described it, there is nothing like a three strikes policy in place.

There seem to be two problems, however, with this course of action. The first is that the company is needlessly muddying the waters in an area of law (the DMCA) that has become more and more significant to computer users as user-generated content has increased in popularity. Perhaps more importantly, however, Cox is pursuing its policy following unsubstantiated accusations of copyright violations, precisely the sort of action that the EU has decided was not going to cut it. After all, it's possible to get DMCA takedown notices sent to printers.


“Orphan Works” copyright reform fails in wake of bailout bid

The Orphan Works Act of 2008 was passed by the Senate last week, but the House failed to take action before taking off for a couple of days, and the bill may be dead until after the fall election. The bill would have loosened restrictions on using copyright-protected works that have been abandoned by their creators. It has faced strong opposition from copyright holders who fear it could create loopholes that would insulate perpetrators of commercial copyright infringement.

The Senate passed the bill after making changes intended to clarify the language and provide a higher level of specificity. The House version of the bill states that individuals are entitled to make use of a work after conducting a "diligent search" for the copyright holder. That aspect of the bill generated controversy and was regarded by some in the content industry as excessively vague. Critics were concerned that the copyright office, which would ultimately have the task of interpreting the legislation and defining the standards for diligence, would be unable to devise consistent requirements.

The final version of the bill included an expanded definition of "diligent search" that instructs users to seek expert assistance and look in Internet databases and Copyright Office records to try to ascertain the rights-holder before using a work without permission. Specifically, users must first search the relevant Copyright Office records, search for the owner in "reasonably available" sources of information, use technology and printed publications, and search various databases, including those available online.

In fact, the bill specifies that these guidelines are merely minimums, and as pointed out by Public Knowledge, it also asks the Copyright Office to take into account comments from the Small Business Administration Office of Advocacy in order to determine best practices. This was added, no doubt, in order to quell critics' fears that the Copyright Office was not well-equipped to handle the job entirely on its own.

Unfortunately, after the Senate passed its version of the bill, it was sent to the House just as all hell broke loose over the current economic situation in the US, leaving it lost in the fallout from the failed attempt to pass the $700 billion bailout bill. As a result, lobbyists have apparently told Wired that the House isn't likely to take up the Orphan Works Act until after the November election.


Circling the drain? AOL axing two more services this month

It looks as if AOL is desperately trying to right a sinking ship as the company continues to shed dead weight. Two of the once-popular online portal's services now have until the end of the month before they turn off the lights and users are sent elsewhere.

The two services being cut are AOL Journals and AOL Hometown. The former is a blogging service for members (a bit like Blogger and WordPress), while the latter lets users create and host their own web pages. In a short blog entry, the AOL team offers no explanation for the closings, but it did give information on how users can save their Hometown/FTP information and migrate their blogs to other blogging services. Affected users are also being notified of the closings by e-mail.

In a way, it's not very surprising that these two services are being cut. They are, after all, fairly redundant in a world where there's a plethora of other (and probably better) blogging and web page creation/hosting services available. There can't have been that many people using those services through AOL, and similarly, there can't have been that many people working to keep them running either.

As for the rest of AOL, it's no secret that it continues to be a revenue drag on its parent company, Time Warner. Although AOL managed to bring in more than a billion dollars in revenue last quarter, the sources of that revenue highlight AOL's decline as an ISP. Web-based advertising actually saw a slight increase in income, but domestic subscriber numbers dropped by over a quarter, causing a plunge in income that wiped advertising's gains off the balance sheet several times over.

Given AOL's circumstances, Time Warner is supposedly about to be back in talks with Yahoo over a possible merger. Yahoo's board recently cleared the way for new discussions between the two companies, which could result in Yahoo gobbling up AOL in exchange for a stake in the combined company for Time Warner. Microsoft is also said to be in talks with AOL, but that may be partly an effort to drive up AOL's asking price and make an eventual buyout by Yahoo more costly.

Will cutting off AOL Journals and AOL Hometown actually make a difference for AOL? Probably not. However, trimming the fat will at least help the company in its negotiations with others, since whatever remains when an eventual merger takes place will be what matters most to AOL's core business.

Further reading: More AOL Shutdowns: Journals, Hometown Axed (via Silicon Alley Insider)


GIMP 2.6 released, one step closer to taking on Photoshop

A new release of the venerable GNU Image Manipulation Program (GIMP) is now available for download. Version 2.6 offers a variety of new features and user interface improvements, and is also the first release to include support for the Generic Graphics Library (GEGL), a powerful, graph-based image editing framework.

The GIMP is an open source graphics editor that is available for Linux, Windows, and OS X. It aims to provide Photoshop-like capabilities and offers a broad feature set that has made it popular with amateur artists and open source fans. Although the GIMP is generally not regarded as a sufficient replacement for high-end commercial tools, it is beginning to gain some acceptance in the pro market.

One of the most significant limitations of the GIMP is that it has traditionally only supported 8 bits per color channel. This weakness is commonly cited as a major barrier to GIMP adoption by professional artists, who require greater color depth. This problem has finally been addressed by the new GEGL backend, which delivers support for 32-bpc. The inclusion of GEGL is a major milestone for the GIMP and takes it one step closer to being a viable Photoshop replacement for professional users. In this release, GEGL is still not quite ready to be enabled by default, but users can turn it on with a special option.
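The value of the deeper pipeline is easy to illustrate. The sketch below is not GIMP or GEGL code; it is a toy model showing how an edit sequence that should round-trip exactly (darken by half, then double the brightness) destroys detail when every intermediate result is quantized to an 8-bit integer channel value, while a floating-point pipeline like GEGL's preserves the original value.

```python
# Toy model of precision loss at 8 bits per channel vs. a float pipeline.
# Not GIMP code -- just an illustration of why bit depth matters.

def halve_then_double(value, steps, eight_bit):
    """Darken by 50%, then brighten by 2x, `steps` times.

    With eight_bit=True, each intermediate result is truncated to an
    integer channel value in [0, 255], as an 8-bpc pipeline would store it.
    """
    v = float(value)
    for _ in range(steps):
        v *= 0.5
        if eight_bit:
            v = float(int(v))          # quantize the darkened value
        v *= 2.0
        if eight_bit:
            v = float(min(255, int(v)))  # quantize the brightened value
    return v

print(halve_then_double(101, 3, eight_bit=True))   # 100.0 -- detail lost
print(halve_then_double(101, 3, eight_bit=False))  # 101.0 -- exact round trip
```

The 8-bit path turns 101 into 100 on the first pass and can never recover it; repeated adjustments in a real editor compound this into visible banding, which is exactly what higher-precision internal buffers avoid.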

GIMP 2.6 also includes some minor user interface enhancements. The application menu in the tool palette window has been removed, and its contents have been merged into the document window menu. A document window will now be displayed at all times, even when no images are open. The floating tool windows have also been adjusted so that they are always displayed over the document window and cannot be obscured. To reduce clutter and make the windows easier to manage, the floating windows will no longer be listed in the taskbar.

The GIMP user interface has long been a source of controversy, and is characterized by some users as one of the worst on the Linux desktop. The modest changes made in this release are nice improvements, but probably won't be enough to satisfy the most vehement haters. A more extensive redesign is in the works, however, and the developers are gathering insight from users and experts. The empty window behavior in version 2.6 is based on one of the first specification drafts that emerged from the redesign project.

There are a number of important functionality improvements that will be welcomed by users, too. The freehand selection tool now has support for polygonal selections and editing selection segments, the GIMP text tool has been enhanced to support automatic wrapping and reflow when text areas are resized, and a new brush dynamics feature has added some additional capabilities to the ink and paint tools. Version 2.6 also has a few improvements for plug-in developers, like a more extensive scripting API for manipulating text layers.

For the next major release, the developers plan to improve GEGL support and integrate the development work that was done in Summer of Code projects. One of the Summer of Code projects that could land in 2.8 brings support for editing text directly on the image canvas, thus obviating the need for a text input dialog. Another project that we could see in 2.8 adds support for marking specific brushes and gradients with tags so that they are easier to find and organize.

Users can download the latest release from the GIMP web site. For more details about the new version, check out the official release notes.
