Feed aggregator

Where Copyright Fails, Open Licenses Help Creators Build Towards a Future of Free Culture

EFF Breaking News - Fri, 24/10/2014 - 00:13

One of the convictions that drew law professor and former EFF board member Lawrence Lessig to co-found Creative Commons was that a narrow and rigid application of copyright law makes no sense in the digital age. Copying digital information over long distances and at virtually no cost is what the Internet does best; indeed, the Internet wouldn't work at all if copying weren't possible.

If all online copying requires permission—a worldview that Lessig has termed "permission culture"—then a huge part of our modern systems for creating and conveying knowledge must always secure explicit, prior permission in order to operate without risk of future lawsuits. It is permission culture that leads to absurd results such as the criminal charges leveled against Diego Gomez for sharing an academic publication with colleagues online.

Creative Commons—and by extension, the broader open access movement that often relies on Creative Commons licenses—pushes back against this worldview, in favor of an alternative vision of free culture, in which creative and knowledge works are freely exchanged, and where demanding permission for re-use and sharing can be the exception, rather than the rule.

CC helps copyright law serve its real purpose, ensuring that a system built around narrow permissions and exceptions does not impede the freedom to share. Creative Commons and similar open access licenses use copyright law to assure users that they are free to copy and share works and, depending on the license the author chooses, also free to modify them and distribute modified versions. (Free and open source software licenses work in a similar way.)

But however clever this is, should we be using copyright law—a regulatory system that many believe defaults to requiring authorization—to help guarantee access to knowledge and the freedom to share? Some individuals, particularly in the free and open source software community, have answered "no." These developers reject outright the authority of copyright law to govern the use of the code they write. This has led to the phenomenon of so-called POSS (Post Open Source Software), whereby developers simply commit their code to openly available repositories like GitHub and express their disdain for copyright law by deliberately declining to choose a license. Unfortunately, this practice casts reuse of the code into a legal gray zone. Code that is not clearly licensed can be confusing for would-be users, because the default assumption is that most copying and reuse is infringing unless the author has permitted it.

In recent years a crop of software licenses has also emerged, such as the Unlicense and others under more colorful names, that seek to reconcile the fact that programmers and many of their users don't care about copyright law with the reality that other users of software, and judges, do. For creative works, the CC0 public domain dedication serves a similar purpose. Just as the rest of the Creative Commons licenses are an attempt to reflect authors' desire to do away with the "permission required by default" model of copyright, these instruments attempt to give users the same freedoms they have over material that has passed out of copyright into the public domain.

These sorts of public domain dedications and licenses are a good compromise, and an important addition to the existing pantheon of free culture and open source licenses that preceded them. As Creative Commons board member Michael Carroll put it before Congress earlier this year, “Some copyright owners feel like they want the option to get out of the copyright system.” Using a legal instrument to opt out of the copyright regime altogether, to the extent the law allows, meets this need.

But those who reject copyright licensing entirely typically do so not because they are unaware that their code or writing is automatically subject to copyright; rather, they refuse as a political statement that this should not be the case. Using a public domain dedication or permissive license that accepts the jurisdiction of copyright law over your work is seen as acceding to the rules of permission culture; refusing to accept it, quixotic as that may be, is seen as subverting those rules.

Omitting a legally binding license entirely from a work, while asserting in straightforward language your disavowal of such licenses, can be a statement about the current state of copyright. In practical terms, however, modern copyright law works to undo that statement by dissuading users from taking advantage of such works, given the legal gray area within which they must operate. Copyright law remains a regime to be carefully stepped around, instead of being fixed at the root to offer clearer, simpler choices for creators and users.

Both EFF and Creative Commons have made joint statements aimed at reforming copyright law more thoroughly, to make it fit for purpose in the digital age. As CC's Timothy Vollmer wrote last year:

the existence of open copyright licenses shouldn’t be interpreted as a substitute for robust copyright reform. Quite the contrary. The decrease in transaction costs, increase in collaboration, and massive growth of the commons of legally reusable content spurred on by existence of public licenses should drastically reinforce the need for fundamental change, and not serve as a bandage for a broken copyright system. If anything, the increase in adoption of public licenses is a bellwether for legislative reform — a signal pointing toward a larger problem in need of a durable solution.

We celebrate the open access movement this week, and the work that academics and readers around the world do to share knowledge as freely as they can. But we must not forget the goal of a future in which the open access movement wouldn't even be necessary, because open access to our knowledge commons, and particularly to academic research, would be the default rather than the exception.

Between October 20 and 26, EFF is celebrating Open Access Week alongside dozens of organizations from around the world. This is a week to acknowledge the wide-ranging benefits of enabling open access to information and research—as well as exploring the dangerous costs of keeping knowledge locked behind publisher paywalls. We'll be posting on our blog every day about various aspects of the open access movement. Go here to find out how you can take part and to read the other Deeplinks published this week.

Related Issues: Fair Use and Intellectual Property: Defending the Balance; Open Access

Civilization: Beyond Earth—Next time, reach for the stars

Ars Technica - Fri, 24/10/2014 - 00:00
A single blue orb floating among billions, part of a galaxy that’s among hundreds of billions, houses the sum total of human achievement. The Sid Meier's Civilization series is one of those achievements, taking the total history of that great, big ball we all live on and condensing it into perhaps the best, and certainly the most popular, 4X strategy game ever made.

Civilization has always held the sanitized, slightly goofy ideal common to all projects bearing Meier's moniker. Maybe Civilization: Beyond Earth's developers felt infinitesimal when considering the vastness of space, or maybe they were simply struck with a distrust of the future common to science fiction. Either way, the latest game in the franchise that all but defines turn-based strategy is a bit less sanitized and a bit more sinister than its predecessors.

For one thing, despite the veneer of technological and social advancement inherent in exploring life on a new planet, the future represented by Beyond Earth is frighteningly similar to that of past Civilization titles. The humans still squabble over resources, land, and ideology, and they do so in ways that are similar to Civilization V from turn one on.

The similarities make Beyond Earth feel more like a sci-fi themed Civ V expansion than a bold new direction for the series. Units are moved the same way; cities are grown the same way; resource tiles are worked in the same way. While the new victory conditions each have some pseudoscience flavor dialogue, winning is still a matter of out-researching or out-fighting opposed factions in more or less the same ways as before.


In win for broadcasters, court shuts down Aereo’s live TV feature

Ars Technica - Thu, 23/10/2014 - 23:40
Aereo on an iPad. Casey Johnston

A New York federal judge has sided with a group of major broadcasters—including Twentieth Century Fox and the Public Broadcasting Service—and shut down TV-over-the-Internet startup Aereo's "Watch Now" system.

"The Supreme Court has concluded that Aereo performs publicly when it retransmits Plaintiffs' content live over the Internet and thus infringes Plaintiffs' copyrighted works," Judge Alison Nathan wrote in her 17-page opinion and order on Thursday.

"In light of this conclusion, Aereo cannot claim harm from its inability to continue infringing Plaintiffs' copyrights. In addition, in light of the fact that Plaintiffs have shown a likelihood of success on the merits rather than just sufficiently serious questions going to the merits, they need no longer show that the balance of hardships tips decidedly in their favor."


Uber Delivers Flu Shots: How On-Demand Tech Can Actually Do Good

Wired - Thu, 23/10/2014 - 23:17

Flu is most dangerous to the most vulnerable---small children, the elderly, the immuno-compromised---and every responsible adult should get a flu shot to help keep the germ from spreading. Typically, this involves a trip to the doctor or a local drug store. But on Thursday, in Boston, New York, and Washington, D.C., there was another option that didn't even involve leaving the house: Uber.

The post Uber Delivers Flu Shots: How On-Demand Tech Can Actually Do Good appeared first on WIRED.

Thursday Dealmaster has a Dell XPS 8700 desktop computer for $799.99

Ars Technica - Thu, 23/10/2014 - 22:15

Greetings, Arsians! The dealmaster is back with a bunch of deals courtesy of our partners at TechBargains. This week the top deal is a Dell XPS 8700 desktop computer. For just $799.99 you get a 2.6GHz Core i7, 16GB of RAM, a 2TB hard drive, and a GeForce GTX 745. That's $500 off the regular price. If your current rig is feeling a little sluggish, maybe it's time to upgrade? This and tons more deals are below. For more desktop deals, visit the TechBargains site.

Featured deal

Dell XPS 8700 Core i7 Desktop w/ 16GB RAM, 2TB Hard Drive & 4GB GeForce GTX 745 for $799.99 plus free shipping (list price $1299.99 | use coupon code TQR2JHV6XV?$MP)


Amazon Fire HDX 8.9 (2014) impressions: Deja vu 8.9

Ars Technica - Thu, 23/10/2014 - 22:00
Sam Machkovech

It's tablet season! We're swimming in tablets! The tablet fairy has arrived! Etc., etc., etc. As a result, we want to offer first impressions on devices that might otherwise fall through the cracks—and no high-end tablet fits that bill better than this year's Amazon Fire HDX 8.9, which just arrived at our doorstep.

That's because it's quite easy to mistake this for last year's Amazon Fire HDX 8.9. In fact, typing "Fire HDX 8.9" into Amazon's search bar will bring up last year's model by default, making us wonder why Amazon didn't take the opportunity to, we don't know, add a "point one" to the name. Either way, if you hold both models in your hands at the same time, you're not likely to notice a major difference at first glance.

They share the same weight (13.2 oz), the same dimensions and thickness, the same 2560x1600 display (measuring at, you guessed it, 8.9 inches), the same cameras, and even the same aesthetics, from the massive bezels on the front to the angled, soft plastic shape on the back.


Petter Reinholdtsen: I spent last weekend recording MakerCon Nordic

Planet Debian - Thu, 23/10/2014 - 22:00

I spent last weekend at MakerCon Nordic, a great conference and workshop for makers in Norway and the surrounding countries. I had volunteered on behalf of the Norwegian Unix Users Group (NUUG) to video record the talks, and we had a great and exhausting time recording the entire day, two days in a row. There were only two of us, Hans-Petter and me, and we used the regular video equipment for NUUG, with a dvswitch, a camera, and a VGA to DV converter box, and mixed video and slides live.

Hans-Petter did the post-processing, consisting of uploading around 180 GiB of raw video to YouTube, and the result is now becoming public on the MakerConNordic account. The videos carry the license NUUG always uses on our recordings, which is Creative Commons Navngivelse-Del på samme vilkår 3.0 Norge (Attribution-ShareAlike 3.0 Norway). Many great talks are available. Check it out! :)

Garrett: Linux Container Security

LWN.net - Thu, 23/10/2014 - 21:59
Matthew Garrett considers the security of Linux containers on his blog. The attack surface of containers is likely to always be larger than that of hypervisors, but that difference may not matter in practice; getting there, however, is going to take some work: I suspect containers can be made sufficiently secure that the attack surface size doesn't matter. But who's going to do that work? As mentioned, modern container deployment tools make use of a number of kernel security features. But there's been something of a dearth of contributions from the companies who sell container-based services. Meaningful work here would include things like:
  • Strong auditing and aggressive fuzzing of containers under realistic configurations
  • Support for meaningful nesting of Linux Security Modules in namespaces
  • Introspection of container state and (more difficult) the host OS itself in order to identify compromises
These aren't easy jobs, but they're important, and I'm hoping that the lack of obvious development in areas like this is merely a symptom of the youth of the technology rather than a lack of meaningful desire to make things better. But until things improve, it's going to be far too easy to write containers off as a "convenient, cheap, secure: choose two" tradeoff. That's not a winning strategy.

Schaller: GStreamer Conference 2014 talks online

LWN.net - Thu, 23/10/2014 - 21:29
On his blog, Christian Schaller announced the availability of videos from the recently completed GStreamer Conference. "For those of you who like me missed this years GStreamer Conference the recorded talks are now available online thanks to Ubicast. Ubicast has been a tremendous partner for GStreamer over the years making sure we have high quality talk recordings online shortly after the conference ends. So be sure to check out this years batch of great GStreamer talks."

Ferns send signals to decide what sex to be

Ars Technica - Thu, 23/10/2014 - 21:10
Lygodium japonicum, the fern in question. Doug Goldman. USDA

Sex exists because it's evolutionarily useful—it makes it easier for a population to share genetic novelties and dilute out harmful mutations. But it's also subject to all sorts of additional evolutionary constraints, from the amount of resources devoted to offspring to the challenge of ensuring that a population ends up with a useful ratio of male and female individuals.

A paper in today's issue of Science suggests that some species of fern have evolved a rather novel solution to creating a good balance between the sexes: they discuss it as a community, with the discussion taking place via chemical signals. A team of Japanese researchers shows that the earliest-maturing sex organs in a group of ferns will invariably develop as females. Once they do, they start producing and exporting a chemical signal.

That signal is a chemically inactivated hormone. When it's received by an immature sex organ, it gets converted to the mature form, which then influences the development of the tissue, causing it to mature as a male. The trick to all this working is that an enzyme that's essential for activating the hormone is present in immature tissue, but the gene that encodes it gets shut down as the tissue matures.


Enrico Zini: systemd-default-rescue

Planet Debian - Thu, 23/10/2014 - 21:06
Alternate rescue boot entry with systemd

Since systemd version 215, adding systemd.debug-shell to the kernel command line activates the debug shell on tty9 alongside the normal boot. I like the idea of that, and I'd like to have it in my standard 'rescue' entry in my grub menu.

Unfortunately, by default update-grub does not allow customizing the rescue menu entry options. I have just filed #766530 hoping for that to change.

After testing the patch I proposed for /etc/grub.d/10_linux, I now have this in my /etc/default/grub, with some satisfaction:

GRUB_CMDLINE_LINUX_RECOVERY="systemd.log_target=kmsg systemd.log_level=debug systemd.debug-shell"
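Kernel command line aside, systemd also ships a unit for the same shell, which may be a simpler route when no grub customization is wanted; a minimal sketch (assuming systemd ≥ 215 and the upstream unit name):

```shell
# Enable the root debug shell on tty9 at every boot,
# without touching the kernel command line:
systemctl enable debug-shell.service

# Or start it only for the current boot:
systemctl start debug-shell.service
```

Either way the shell sits on tty9 (Ctrl+Alt+F9); since it is an unauthenticated root shell, it should stay disabled on production machines.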

Further information:

Thanks to sjoerd and uau on #debian-systemd for their help.

Science Graphic of the Week: Spectacular, Twisted Solar Eruption

Wired - Thu, 23/10/2014 - 20:31

Like many stars, the sun is prone to sudden outbursts. Erupting from the star's surface, these events sometimes sling globs of charged particles and sun-stuff in Earth's direction. If they're powerful enough, these coronal mass ejections can produce geomagnetic storms that damage satellites and disrupt power grids.

The post Science Graphic of the Week: Spectacular, Twisted Solar Eruption appeared first on WIRED.

Bentley Bolsters Its Racing Reputation With a $337K Beast

Wired - Thu, 23/10/2014 - 20:28

Most Bentleys indulge their owners with champagne buckets and neck warmers. The GT3-R does it with a “Sport” mode that won't upshift until the engine hits 6,300 RPM.

The post Bentley Bolsters Its Racing Reputation With a $337K Beast appeared first on WIRED.

T-Mobile: Our network has trouble with building walls and long distances

Ars Technica - Thu, 23/10/2014 - 20:15
T-Mobile's network won't reach into "your cave," so you'll need some Wi-Fi. T-Mobile

T-Mobile US is really looking forward to next year’s spectrum auction. Today, it doesn’t have enough low-band spectrum to match the networks of AT&T and Verizon Wireless, T-Mobile VP of Federal Regulatory Affairs Kathleen Ham wrote in a blog post.

“As our competitors well know, arming T-Mobile with low-band spectrum is a competitive game-changer, enabling our service to penetrate building walls better and travel longer distances than we can with the spectrum we have today,” Ham wrote. “Imagine a T-Mobile with even greater coverage, offering innovative Un-carrier deals to even more customers in even more places—in direct competition with the Twin Bells!”

The Federal Communications Commission plans to set aside spectrum for carriers that lack low-band frequencies (those under 1GHz) in the auction of 600MHz spectrum currently controlled by TV broadcasters. But T-Mobile says the FCC’s plan doesn’t go far enough.


Ubuntu 14.10 (Utopic Unicorn) released

LWN.net - Thu, 23/10/2014 - 20:13
Ubuntu has announced its latest release: 14.10 "Utopic Unicorn". As usual, it comes with versions for server, desktop, and cloud, along with multiple official "flavors": Kubuntu, Lubuntu, Mythbuntu, Ubuntu GNOME, Ubuntu Kylin, Ubuntu Studio, and Xubuntu. All of the varieties come with a 3.16 kernel and many more new features: "Ubuntu Desktop has seen incremental improvements, with newer versions of GTK and Qt, updates to major packages like Firefox and LibreOffice, and improvements to Unity, including improved High-DPI display support. Ubuntu Server 14.10 includes the Juno release of OpenStack, alongside deployment and management tools that save devops teams time when deploying distributed applications - whether on private clouds, public clouds, x86 or ARM servers, or on developer laptops. Several key server technologies, from MAAS to Ceph, have been updated to new upstream versions with a variety of new features." More information can be found in the release notes.

Facebook and Yahoo Find a New Way to Save the Web’s Lost Email Addresses

Wired - Thu, 23/10/2014 - 20:00

When Yahoo proposed a plan to reuse mothballed email addresses, a lot of people didn’t like it. WIRED’s Mat Honan called it a “very bad idea,” and with good reason.

The post Facebook and Yahoo Find a New Way to Save the Web’s Lost Email Addresses appeared first on WIRED.

Updating to iOS 8.1: Are Apple Pay, OS X ‘Continuity’ Worth the Trouble?

Wired - Thu, 23/10/2014 - 19:40

The recent releases of Apple's latest computer and mobile operating systems, OS X Yosemite and iOS 8.1, respectively, have brought to full fruition the features announced at Apple's September 9th Keynote address. Several features have been hailed as revolutionary, while others have been met with either trepidation or full-on skepticism. I've had a chance […]

The post Updating to iOS 8.1: Are Apple Pay, OS X ‘Continuity’ Worth the Trouble? appeared first on WIRED.

Research Is Just the Beginning: A Free People Must Have Open Access to the Law

EFF Breaking News - Thu, 23/10/2014 - 19:33

The open access movement has historically focused on access to scholarly research, and understandably so. The knowledge commons should be shared with and used by the public, especially when the public helped create it.

But that commons includes more than academic research. Our cultural commons is broader than what academia produces; it includes all of the information, knowledge, and learning that shape our world. And one crucial piece of that commons is the rules by which we live. In a democratic society, people must have an unrestricted right to read, share, and comment on the law. Full stop.

But access to the law has been limited in practice. Not long ago, most court documents and decisions were available only to those with access to physical repositories. Digitization and the Internet changed that, but even today most federal court documents live behind a government paywall known as PACER. And until recently, legal decisions were difficult to access if you couldn't afford a subscription to a commercial service, such as Westlaw, that compiles and tracks those decisions.

The good news: open access crusaders like Public.Resource.Org and the Center for Information Technology Policy have worked hard to correct the situation by publishing legal and government documents and giving citizens the tools to do so themselves.

The bad news: the specter of copyright has raised its ugly head. A group of standards-development organizations (SDOs) have banded together to sue Public.Resource.Org, accusing the site of infringing copyright by reproducing and publishing a host of safety codes that those organizations drafted and then lobbied heavily to have incorporated into law. These include crucial national standards like the national electrical codes and fire safety codes. Public access to such codes—meaning not just the ability to read them, but to publish and re-use them—can be crucial when there is an industrial accident; when there is a disaster such as Hurricane Katrina; or when a home-buyer wants to know whether her house is code-compliant. Publishing the codes online, in a readily accessible format, makes it possible for reporters and other interested citizens to not only view them easily, but also to search, excerpt, and generate new insights.

The SDOs argue that they hold a copyright on those laws because the standards began their existence in the private sector and were only later "incorporated by reference" into the law. That claim conflicts with the public interest, common sense, and the rule of law.

With help from EFF and others, Public.Resource.Org is fighting back, and the outcome of this battle will have a major impact on the public interest. If any single entity owns a copyright in the law, it can sell or ration the law, as well as make all sorts of rules about when, where, and how we share it.

This Open Access Week, EFF is drawing a line in the sand. The law is part of our cultural commons, the set of works that we can all use and reuse, without restriction or oversight. Protecting that resource, our common past and present, is essential to protecting our common future. That’s why the open access movement is so important, and we’re proud to be part of it.


Related Issues: Fair Use and Intellectual Property: Defending the Balance; Open Access; International

Gunnar Wolf: Listadmin — *YES*

Planet Debian - Thu, 23/10/2014 - 19:05

Petter posted yesterday about listadmin, the quick way to moderate Mailman lists.

Petter: THANKS.

I am a fan of automation. But, yes, I had never thought of doing this. Why? I don't know. But this is way easier than using the Web interface for Mailman:

$ listadmin
fetching data for conoc_des@my.example.org ... nothing in queue
fetching data for des_polit_pub@my.example.org ... nothing in queue
fetching data for econ_apl@my.example.org ... nothing in queue
fetching data for educ_ciencia_tec@my.example.org ... nothing in queue
fetching data for est_hacend_sec_pub@my.example.org ...
[1/1] ============== est_hacend_sec_pub@my.example.org ======
From:    sender@example.org
Subject: Invitación al Taller Insumo Producto
Reason:  El cuerpo del mensaje es demasiado grande: 777499   Spam? 0
Approve/Reject/Discard/Skip/view Body/Full/jump #/Undo/Help/Quit ? a
Submit changes? [yes]
fetching data for fiscal_fin@my.example.org ... nothing in queue
fetching data for historia@my.example.org ... nothing in queue
fetching data for industrial@my.example.org ... nothing in queue
fetching data for medio_amb@my.example.org ... nothing in queue
fetching data for mundial@my.example.org ... nothing in queue
fetching data for pol_des@my.example.org ... nothing in queue
fetching data for sec_ener@my.example.org ... nothing in queue
fetching data for sec_prim@my.example.org ... nothing in queue
fetching data for trab_tec@my.example.org ... nothing in queue
fetching data for urb_reg@my.example.org ... nothing in queue
fetching data for global@my.example.org ... nothing in queue

I don't know how, in many years of managing several mailing lists, I never thought of this! I'm echoing it here, as I know several of my readers run Mailman as well and might not be following Planet Debian.
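For anyone who wants to reproduce this, listadmin reads its configuration from ~/.listadmin.ini; a minimal sketch with placeholder addresses and password (the exact key names should be checked against listadmin(1)):

```
# ~/.listadmin.ini -- minimal sketch; all values are placeholders
username admin@my.example.org
password "secret"
default skip

# lists to moderate, one per line; listadmin visits each in turn
conoc_des@my.example.org
est_hacend_sec_pub@my.example.org
```

With that in place, running listadmin with no arguments walks through the moderation queues interactively, as in the transcript above.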

A huge tsunami in Hawaii’s past warns of future risk

Ars Technica - Thu, 23/10/2014 - 19:05
Simulated tsunamis for earthquakes in several locations. Rhett Butler

Surfers love Hawaii’s waves, and many dream of catching “the big one.” For most people living in coastal areas vulnerable to tsunamis, though, “the big one” is a bad dream. We’ve seen many devastating events over the years, but our memory is not so long that Mother Nature can’t surprise us. The 2011 tsunami in Japan testified to that.

In 2001, sediment from a past tsunami was found in a sinkhole on the southeast side of the Hawaiian island of Kaua’i. The mouth of that sinkhole is about a hundred meters from the shoreline—and over seven meters above sea level. The largest tsunami measured in the area had been three meters, courtesy of Chile’s monstrous magnitude 9.5 earthquake in 1960. Could a past event have been big enough to send tsunami waves over seven meters high to Hawaii?

Researchers Rhett Butler, David Burney, and David Walsh simulated a variety of earthquakes around the Pacific to find out. They used a model that simulates the spread of tsunami waves, creating some virtual magnitude 9.0 to 9.6 earthquakes from Alaska to Kamchatka.

