A friend and I were talking about this a little while ago, and I wanted to know what you guys think about it.
Basically, I'm wondering whether there could ever be enough political backing for some sort of network that simply mirrors valuable content once it has passed a certain age: old movies, older versions of software, etc.
Right now, the whole p2p thing has largely been framed as thieves vs. the big bad media corporate whores, and most people conducting their p2p business to acquire worthwhile files do it in a legal grey zone that turns red every once in a while when the MPAA or some related lawyer-stuffed institution comes down with the hammer.
I don't see why it has to be this way. I think an open network for stuff that isn't sold in significant numbers anymore, and for which the original producers don't see a dime anyway, would be a good way to fill the pool of legally available files, and at the same time it would give public opinion a good example of how to use p2p technology effectively while still maintaining some sort of social consensus.
Of course, the chances of this happening anytime soon are slim; the copyrights and patents held by the major companies are clung to like they're the ultimate treasure for restricting everyone else's business. The main laws on this are probably still paying off for some parties, otherwise they wouldn't exist. Still, I'd like to see some change, and I don't want it to end with trusted computing killing all of our little freeloading pleasures over the next few decades.
A P2P net for just unlicensed stuff sounds like a good idea, but it's unlikely to get very popular...
Though, following P2P over the past few years, it's pretty obvious that heavier encryption is the main technology push forward. Every time a P2P net is shut down or sued into oblivion, a new, more heavily encrypted version is born. (Which is evolution in action: every time a net gets killed, a harder-to-kill net takes its place. The recording/entertainment industry could potentially own P2P by now, but by attacking it they've only made it more resistant.)
Things like Freenet and i2p look like the future of P2P (google them; they're pretty much military-grade encrypted networks).
I know Freenet and i2p, but encryption and diffusion add a lot to a) the greyness of the legal zone and b) the general slowness of the network. What I envision is a network that's open, fast and free (though it would be worth considering whether development and support could be sponsored by those who give up their copyright claims).
BitTorrent seems, to a certain extent, to be the most-used network (if you can call it a single "network") for legal P2P downloads. You just stick to sites and trackers that offer legit or quasi-legit downloads, like archive.org or fansub trackers, and you're golden. That being said, wherever there is a network that allows people to share their own files without moderation, there will be piracy. There's nothing you can do about that.
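For what it's worth, here's a rough Python sketch of pulling legit torrents straight from archive.org's search API. The advancedsearch.php endpoint, the query fields, and the <identifier>_archive.torrent naming are from memory, so treat them as assumptions and check them against the Archive's own docs before relying on this:

    # Sketch: list public-domain movies on archive.org and build their torrent URLs.
    # The endpoint, query fields, and torrent naming convention are assumptions.
    import json
    import urllib.parse
    import urllib.request

    query = urllib.parse.urlencode({
        "q": "mediatype:movies AND licenseurl:*publicdomain*",
        "fl[]": "identifier",
        "rows": "10",
        "output": "json",
    })

    with urllib.request.urlopen("https://archive.org/advancedsearch.php?" + query) as resp:
        docs = json.load(resp)["response"]["docs"]

    for doc in docs:
        ident = doc["identifier"]
        # Each item is supposed to expose a torrent seeded off the Archive's own servers.
        print(f"https://archive.org/download/{ident}/{ident}_archive.torrent")

Every one of those torrents points back at content the Archive hosts itself, so it's about as close to an "open, fast and free" legal swarm as you can get today.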
Fansubs aren't legit in any way, of course. The best bet would be running a closed BitTorrent tracker, or some other P2P system that rates speed and ease of use over trying anything that attempts to reduce piracy.
De jure fansubs aren't legal, but de facto I think most people would agree they're in a bit of a gray area. Heck, I think so, and partake in them from time to time, and I'm a big supporter of people and companies being able to enforce their copyrights.
Fansubs are a bad example of what would be available in the kind of network I was talking about, though.
The Internet Archive has a ton of stuff that's gone into the public domain. (remember kids, copyrights aren't supposed to be forever!)
Yes, but that's HTTP stuff, isn't it? I also understand that their servers are getting quite hammered these days.
The Internet Archive doesn't archive sites that exclude robots via robots.txt.
Very annoying, as a lot of Japanese site software ships with that exclusion enabled by default.
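You can check a given site yourself with a couple of lines of Python; the stdlib robot parser tells you whether its robots.txt shuts out the Archive's crawler. (I'm assuming the crawler still announces itself as "ia_archiver", the old Alexa user-agent, so take that string as a guess.)

    # Quick check: does this site's robots.txt block the Wayback crawler?
    # "ia_archiver" is an assumed user-agent string; the site URL is a placeholder.
    import urllib.robotparser

    site = "http://example.com"  # substitute the site you want to test
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(site + "/robots.txt")
    rp.read()  # fetch and parse robots.txt

    for agent in ("ia_archiver", "*"):
        status = "allowed" if rp.can_fetch(agent, site + "/") else "blocked"
        print(agent, status)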
Plus, often the sites archived are full of holes. The IA must be using very short timeouts, or they hit sites that automatically block bulk downloaders after x elements fetched. You have to make at least a second pass to get everything.
> Plus, often the sites archived are full of holes.
I think given the huge volume of content they have to process, they're probably using some weird "priority" system, so unpopular sites might only have their first page archived, or something like that.
∧∧,..,、、.,、,、、..,_ /i
;'゚Д゚、、:、.:、:, :,.: ::`゙:.:゙:`''':,'.´ -‐i
'、;: ...: ,:. :.、.:',.: .:: _;.;;..; :..‐'゙  ̄  ̄
`"∪∪''`゙ ∪∪´´
Posting in this thread to let you all know that google trawls this site regularly (it grabbed some stuff just the other day in fact), and that alexa/webarchive have gotten word, but it might take a while before they start archiving shit.
w00t.
Like >>8 says, really, this isn't supposed to be a problem in the first place. If copyright terms had a reasonable length of 10 years or so, instead of the obscene 100+ years works get now, all sorts of problems would be eliminated.
Some domains are highly volatile, like software and the internet. Others are less so, like dead-tree media. I'd say there should be different copyright durations depending on which (five years for software, thirty for books?), but I just know someone is going to try to get around it by printing hexadecimal in a book.
The lawyers would love it.