In the 1982 House hearing on Home Recording of Copyrighted Works, Jack Valenti said:
I know of no technological device at this time that would bar taping in the home and if it did exist, it would only be a matter of days before the Japanese manufacturers would have an override piece of equipment on their machine and you would start from ground zero again.
So why is he trying to force such a thing now?
Friday, 31 May 2002
MacOPINION : Matthew Ruben | Celine Dion Killed My iMac!
Detailed article on how the Key2Audio protection racket works:
MacOPINION : Matthew Ruben | Celine Dion Killed My iMac!
Interesting conspiracy theory too:
...we see piracy continuing more or less undisturbed, with fair use being seriously disrupted.
It would be paranoid and silly to think that Sony and other record companies would want to destroy fair use just for the heck of it. There has to be a method to their madness, yes?
[...]
Key2Audio is the first step in a dreadful double perversion of Fair Use. The first perversion is the idea that by making a copy of music for yourself, you are depriving the copyright holder of the ability to obtain revenue from selling you additional copies of the same music. The second, linked, perversion is that by destroying your ability to exercise fair use, the record company extends its copyright power beyond the content (the music) to the delivery medium (the CD).
Wednesday, 29 May 2002
Fighting Terrorism with Google?
A couple of posts on Dave Farber's 'Interesting People' list set this thought off.
First, this one:
WASHINGTON - An experimental computer program designed to analyze intelligence gave U.S. Special Forces a mission recommendation in 2000 that some say could have prevented the Sept. 11 attacks.
In truth, though, the CIA does study foreign press, but before Sept. 11 made little use of computers to collect and analyze classified and unclassified information together, which is what Special Forces began doing in 1997, enabling them to get a read-out on the terror cells.
This smells of fish to me. Computers can't analyze intelligence (unless a lot of AI breakthroughs have happened in secret). People can analyze intelligence. Computers can aggregate it, and help the people analyzing it link to and inform each other.
If the 'intelligence community' worked with the kind of hyperlinking tools that the rest of us use to help make Google the best way to find anything, maybe they'd have got somewhere.
The idea of how this works isn't hard to grasp - my 7-year-old son got it straight away - but it is hard to map to an insufficiently public space, like an intranet or (in particular) a hierarchically organised intelligence network that is more concerned with 'need to know' and secrecy classification.
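For the curious, here is a minimal sketch of the link-ranking idea - a toy PageRank-style power iteration. The graph, function names and damping value are my invented illustrations, not Google's actual implementation:

```python
# Toy PageRank-style ranking: pages are ranked by the links humans make
# between them, so good judgement aggregates without any central editor.
# The four-page 'web' and all parameters here are invented for illustration.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages  # a dangling page shares rank evenly
            for t in targets:
                new_rank[t] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

web = {
    "field-report": ["analysis"],
    "analysis": ["field-report", "summary"],
    "summary": ["analysis"],
    "rumour": ["analysis"],  # links out, but nothing links to it
}
for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Nothing links to 'rumour', so it ranks last; 'analysis' is heavily linked-to and ranks first. The ranking emerges from the links themselves, which is the point.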
Today, David Reed suggested that we harness the public:
An open/transparent world reduces imagination of potential threats. It also increases the reliability of assessing actual threats.
Which tends to synergize with Moynihan's sound-bite: "Secrecy is for losers".
So here's a radical proposal: openly publish most (if not all) of the information collected by the CIA, NSA, ... to public inspection. Figure out how to avoid compromising sources where needed, but get all of it out, efficiently. Use the Internet, because it scales, rather than TV, print, and Radio, which don't.
This will enable all of civil society to become outsourcers of the costly mundane details of threat management, leaving the difficult and specialized functions to experts with specialized resources.
In this world, terrorists' ability to use the leverage of "unknown" threats and the rampant paranoia of their targets to amplify their meager efforts would be dramatically reduced.
This would also mean that the emergent properties of Google indexing millions of individual humans' links could come into play, as discussed in Cory's article 'How I learned to stop worrying and love the Panopticon':
AltaVista for them, Google for us
But what do they do with all of that data that they collect? Filter it for keywords? Fat chance. The volume of false positives (e.g., people talking about child pornography who aren't child pornographers) far exceeds the volume of actual criminal activity. Even creaky old Lycos gave up on plain-old keyword matching a long, long time ago.
Maybe they manually check it. After all, that approach worked for Yahoo, right? Oh, right, it didn't work. Scratch that.
Then they must use some hybrid approach: human editors and AI (Artificial Intelligence or Almost Implemented, take your pick) working in concert to tweeze out the most relevant material as quickly and efficiently as possible.
Right. AltaVista.
Poor bastards.
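Cory's false-positive point is just base-rate arithmetic. Here is a back-of-the-envelope sketch; every number in it is an invented assumption for illustration, not a real statistic:

```python
# Why keyword filtering drowns in false positives: base-rate arithmetic.
# All numbers below are invented assumptions for illustration.

messages_scanned = 100_000_000  # messages the dragnet looks at
criminal_rate = 1e-6            # assumed fraction that are actually criminal
hit_rate = 0.99                 # assumed chance the filter flags a criminal message
false_positive_rate = 0.01      # assumed chance it flags an innocent one

criminal = messages_scanned * criminal_rate
innocent = messages_scanned - criminal

true_hits = criminal * hit_rate
false_hits = innocent * false_positive_rate

print(f"true hits:  {true_hits:,.0f}")   # ~99
print(f"false hits: {false_hits:,.0f}")  # ~1,000,000
print(f"chance a flagged message is criminal: "
      f"{true_hits / (true_hits + false_hits):.4%}")  # ~0.01%
```

Even with an implausibly good filter, that is roughly ten thousand innocent flags for every real one - which is why the human editors get buried.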
BBspot - Copies of Spider-Man 2 Available on the Web
Those darned pirates get smarter all the time...
The EFF parodies 'The Mickey Mouse Club' to fight the CBDTPA.
Cory explains what parody is.
The phrases 'Mickey Mouse Copy Protection' and 'Mickey Mouse Computer' need to enter the language in this context - as in 'Do you want a Mickey Mouse computer that stops you making music?'
Tuesday, 28 May 2002
Dave is complaining about how uncomfortable it is outdoors. His basic problem is that he lives on the wrong coast. We had a little burst of humidity here yesterday, amid the standard perfect 75°F day with a light breeze, and it reminded me of what is wrong with living in the kind of climate zone where that is expected.
Virginia Postrel developed this into a theory of why Silicon Valley beats Boston for innovation...
The US Senate Committee on the Judiciary is collecting comments on the CBDTPA;
here are mine:
The CBDTPA is based on 3 delusions.
1. That computers can be prevented from copying.
This is wrong. The most basic definition of a computer, described by Alan Turing in 1936, is a device that reads and copies symbols and modifies its internal state. He showed that anything capable of doing these things is a digital computer. This bill would thus outlaw any universal Turing machine - which includes the DNA-copying mechanism in your cells. Stephen Wolfram has just shown that you can create a universal computer using 2 internal states and 5 symbols. Any universal computer can emulate any other one, in such a way that the software running on it does not know it is running on an emulation; consequently, all copy protection can be subverted in this manner (see the sketch after this list). The only way the stated goals of this bill can be achieved is by outlawing computation itself.
2. That copyright law gives a monopoly over copying.
A better reading of the law gives a monopoly over redistribution to others, and even this is mitigated by fair use. Preventing copying at source is a huge over-reach, and unconstitutional by any reading.
3. That making something uncopyable increases its value.
This is the key mistake from a business point of view. People will pay more for content in a useful form, and content that can be copied and transformed is more useful. Even setting aside the logical, moral and constitutional issues, making content uncopyable is crazy commercially.
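To make point 1 concrete, here is a minimal sketch of a Turing machine simulator: read a symbol, write (copy) a symbol, change state, move. The example machine is a trivial bit-flipper of my own invention, not Wolfram's 2-state, 5-symbol machine; the interesting part is that the simulated machine has no way to tell it is running on an emulation:

```python
# A computer, per Turing (1936): read a symbol, write a symbol,
# change internal state, move. This simulator is itself an emulation,
# and the machine it runs cannot detect that fact - which is why
# mandated copy protection can always be subverted by emulation.
# The example 'flipper' machine is invented for illustration.

def run(program, tape, state="start", head=0, max_steps=1000):
    """program maps (state, symbol) -> (new_state, new_symbol, move)."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        symbol = cells.get(head, "_")  # "_" is the blank symbol
        if (state, symbol) not in program:
            break                      # no rule applies: halt
        state, new_symbol, move = program[(state, symbol)]
        cells[head] = new_symbol       # the "write/copy" step
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# One state, two rules: flip every bit, halt on the first blank.
flipper = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
}
print(run(flipper, "100101"))  # -> 011010
```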
Send yours too.
Monday, 27 May 2002
John Dvorak gets it on the DMCA:
I have to ask, does anyone in power care at all about the public and its needs? For example, when the copyright laws were first written, the idea was that certain creations would belong to the creator exclusively for a limited period of time and then pass into the public domain to benefit society as a whole. New laws see things differently, though. Now, society as a whole is meaningless. The vested interests of a few already rich individuals and corporations dominate the thinking surrounding copyrights.
Sunday, 26 May 2002
Alistair Cooke mourns Peter Bauer, notes China's claim that the US is closer to Communism than China is (and that China needs to get capitalist first)... and reminds us how broke California is.
A great Defence of Lessig by Ernie the Attorney leads me to Taking Copy out of Copyright, which expresses clearly in legal terms what I have thought for some time - it is not copying that harms content owners, but re-distribution. The 'content' industry counts each copy as a foregone full-price sale, which is odd economics.
The idea that they can prevent all copying by legally enforced technical means is wrong. What they should be doing is focusing on those who are making money from parallel distribution.
A thoughtful comment by 'pyramid termite' on the Slashdot discussion thread:
The reason why organizations such as the mass media and the companies that distribute art were able to lock out live performers is that the "public" was reinvented -- instead of the "public" being anyone a performer could possibly meet, the public became anyone a mass media organization could reach by TV, movies, radio, print, etc.
Now the public is being reinvented again and is becoming anyone the artist or a fan of the art can communicate with. What we are seeing is not simply a war over copyright - it's a war over what the public will be and who will have the right to communicate with it. The mass media would prefer to have a public that remains large with easily controllable desires and means of distribution to it. The new public wants to control its own desires and means of distribution; it wants to be the artist, the publisher and the audience.
There can't be laws to enforce the old mass media copyrights without enforcing the old, outdated mass media model. This is not just a battle over who has the right to distribute a work, but over who has the right to distribute any work and who can create a public to communicate with. The performers would like their public to be anyone they can communicate with - the mass media moguls are calling for laws against these technologies, laws that would make this communication impossible.
Saturday, 25 May 2002
Two more quick points on connectivity. The FCC just lost, on appeal, its regulations forcing the Regional Bells to open their wires to alternative DSL suppliers:
"The commission ... completely failed to consider the relevance of competition in broadband services coming from cable (and to a lesser extent satellite)," Judge Stephen Williams wrote.
So here's a thought - could the FCC instead mandate that the telcos give access to their physical poles and conduits, so that someone else can run fibre through them? (Peter Cochrane, linked below, explains why the telcos won't ever do this.) Or would this be something that needs to happen on a local basis?
Secondly, on the 'commodity' argument, Andrew Odlyzko's 'The History of Communications' is a must-read - it shows how communications become fixed-price commodities over time, covering everything from postage to the net.
He also has a detailed commentary on Roxann Googin's predictions. A couple of snips:
Is the "first mile" a natural monopoly? That is what the failure of the CLECs has led many observers to conclude. Yet there are some contrary indicators. After all, most households do have three separate communication systems, the copper-based one from their ILEC, a coax-based one from their cable TV provider, and a cell phone from a wireless carrier. Thus a much deeper look is needed to understand what is going on, far beyond the scope of this note. A key factor, though, is that change is slow but inevitable. Hence a static analysis of technology choices, without taking account how quickly consumer are likely to move, is bound to be inadequate.
[...]
Policy makers who are interested in promoting competition could help this move along by forcing those ILECs that have not yet done so to completely sever their ties with cellular carriers. This would be a much simpler move, both technically and politically, than the separation of the wireline industry that is widely discussed.
Competition from cellular carriers for voice is likely to force ILECs to concentrate on exploiting their natural advantage in bandwidth, and to emphasize Internet access. (Note again the UK statistics, where internet access traffic on the voice network is fast approaching that of voice itself, especially since the latter figure includes some modem and fax traffic.) This will likely also force them to emphasize broadband, as a way to segment the market, and to create a natural progression path for their customers, towards higher and higher bandwidth.
[...]
The most promising area for [long distance carriers] is to manage networks that are largely owned by their customers. This will be a huge change, but the IBM example shows that it is possible, and also that there is time to do it. The ILECs might be tempted to follow in this same direction, but are less likely to succeed, and may have to resign themselves to operating at lower levels of the networking hierarchy. However, there is likely to be enough opportunity for them even there to thrive.
"The commission ... completely failed to consider the relevance of competition in broadband services coming from cable (and to a lesser extent satellite)," Judge Stephen Williams wrote.
So here's a thought - can the FCC instead mandate that the Telcos have to give access to their physical poles and conduits , so that someone else can run fibre through them? (Peter Cochrane linked below explains why the telco's won't ever do this). Or would this be something that needs to happen on a local basis?
Secondly, on the 'commodity' argument, Andrew Odlyzko's 'The History of Communications' is a must read - it shows how communications become fixed-price commodities over time, covering everything from postage to the net.
He also has a detailed commentary on Roxann Googin's predictions A couple of snips:
Is the "first mile" a natural monopoly? That is what the failure of the CLECs has led many observers to conclude. Yet there are some contrary indicators. After all, most households do have three separate communication systems, the copper-based one from their ILEC, a coax-based one from their cable TV provider, and a cell phone from a wireless carrier. Thus a much deeper look is needed to understand what is going on, far beyond the scope of this note. A key factor, though, is that change is slow but inevitable. Hence a static analysis of technology choices, without taking account how quickly consumer are likely to move, is bound to be inadequate.
[...]
Policy makers who are interested in promoting competition could help this move along by forcing those ILECs that have not yet done so to completely sever their ties with cellular carriers. This would be a much simpler move, both technically and politically, than the separation of wireline industry that is widely discussed.
Competition from cellular carriers for voice is likely to force ILECs to concentrate on exploiting their natural advantage in bandwidth, and to emphasize Internet access. (Note again the UK statistics, where internet access traffic on the voice network is fast approaching that of voice itself, especially since the latter figure includes some modem and fax traffic.) This will likely also force them to emphasize broadband, as a way to segment the market, and to create a natural progression path for their customers, towards higher and higher bandwidth.
[...]
The most promising area for [long distance carriers] is to manage networks that are largely owned by their customers. This will be a huge change, but the IBM example shows that it possible, and also that there is time to do it. The ILECs might be tempted to follow in this same direction, but are less likely to succeed, and may have to resign themselves to operating at lower levels of the networking hierarchy. However, there is likely to be enough opportunity for them even there to thrive.
Connectivity Convergence
I have to admit that reading Dave Weinberger's live coverage of Connectivity 2002, I wished I could have been there with Stuart (as he has explained a lot of networking subtleties to me over the years).
However, the conversation continues through weblogs. Let me try and round up a few points that others have made, helping me to clarify my thoughts.
First, let's split what seemed to be two alternatives into three. The ideal that we're all after is a commoditized 'stupid' packet-switching network, with intelligence at the ends in applications. There are two alternative paradigms fighting against this, and we need to separate them, as they attack from different directions (though often in concert).
The first alternative is 'circuit switching' instead of packet switching. This is the bad solution that gets reinvented continuously by people who like thinking about wires. The notion is that there is a continuous connection between two endpoints that is guaranteed to be unbroken. This requires a much 'smarter' (and hence far more costly to implement and maintain) network that keeps data flowing between two nodes and doesn't fail.
What this really means is that when it does fail, it can't cope at all - you get hung up on. Examples of this are ATM and PPPoE (instead of TCP), and 3G wireless and Bluetooth (instead of 802.11).
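A toy caricature of the difference, with invented loss numbers rather than any real protocol: the 'stupid' packet network lets the endpoints retry around failure, while the 'guaranteed' circuit has nothing to fall back on when its guarantee breaks.

```python
# Toy contrast between end-to-end retransmission and a 'guaranteed' circuit.
# The 20% loss rate and packet counts are invented for illustration.
import random

random.seed(2002)
LOSS = 0.2  # each transmission attempt fails 20% of the time

def stupid_network(packets):
    """Packet switching: endpoints retransmit; the network stays dumb."""
    attempts = 0
    for _ in range(packets):
        delivered = False
        while not delivered:  # end-to-end retry until acknowledged
            attempts += 1
            delivered = random.random() > LOSS
    return f"all {packets} packets delivered in {attempts} attempts"

def smart_circuit(packets):
    """Circuit switching: any mid-call failure drops the whole connection."""
    for sent in range(packets):
        if random.random() < LOSS:
            return f"circuit dropped after {sent} packets - redial and start over"
    return f"all {packets} packets delivered"

print(stupid_network(100))
print(smart_circuit(100))
```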
Cheshire's laws of Network Dynamics summarize it this way:
For every Network Service there's an equal and opposite Network Disservice
Nothing in networking comes for free, which is a fact many people seem to forget. Any time a network technology offers "guaranteed reliability" and other similar properties, you should ask what it is going to cost you, because it is going to cost you. It may cost you in terms of money, in terms of lower throughput (bandwidth), or in terms of higher delay, but one way or another it is going to cost you something. Nothing comes for free.
1. A guaranteed network service guarantees to be a low quality overpriced service
2. For every guarantee there's a corresponding refusal
3. For every Network Connection there's a corresponding Network Disconnection
Or, more succinctly, in The ATM Paradox.
So I continue to contend (contra Roxann Googin, as cited by Dave) that running this kind of network is a bad business to be in compared to a commodity connectivity one: the expensive voice calls that support its existing fixed costs are going away, to be replaced by a bigger, fatter packet-switching network. Peter Cochrane explains how the telcos messed up their opportunity, and summarizes this way:
1. TelCos will never deliver wide-band communications. It is not in their interests to do so; it isn't in their business minds or models. They are into call minutes and billing systems. They are old, gray and don't get it, and they don't intend getting it.
2. The cable companies are slightly better, but have a broadcast mindset, where wide-band is an add-on, a kluge, and not a primary business or technology.
3. Both (1) and (2) have missed their opportunity, wasted time and money on a vast scale, and are now going bust. Five years ago they could have rolled fiber into the local loop, they had the money and the people back then - now they have neither.
So let's get on to bad paradigm 2 - the broadcast mindset of networks optimised for 'content delivery'.
The 'content' industry is really several different pseudo-marketplaces joined together in odd ways through vertical integration. At one end is the VC-like fashion business of choosing which movie or pop singer to invest in.
Then there is the long chain of distribution selling the resulting works, often controlled by the Studio or Label.
Finally, there is the weird inverted marketplace of broadcasting, where the audience is sold to advertisers with the 'content' as bait, and the 'content' itself gets manipulated to promote sales through 'payola'.
Because of the tangled nature of these shenanigans, it is never clear what the business really is, but it is this group that presents the biggest threat to an open network, as they would like to impose huge restrictions to protect themselves from competition.
Doc points to an article attacking Lessig in a crude and formulaic way, assuming that all creativity comes from the centre, and advocating DRM as providing new service to consumers.
Doc then slightly mis-states Lessig's argument in defence.
Lessig isn't saying the net has 'natural' laws; he is saying that the current architecture of the net (the end-to-end, stupid architecture) has these kinds of characteristics, but that programmers, not poets, are now the unacknowledged legislators of the world.
He explains well that law, code, norms and markets influence behaviour, appreciates that all of these can be modified, and urges those modifying them to build in and expand the values of openness we currently enjoy.
Finally, I liked David Reed's discussion of options for funding - I've been reading 'Extreme Programming Explained' this week too, and it expresses much the same idea, applied to coding choices.
Friday, 24 May 2002
Einstein quotes
For Akma:
Things should be made as simple as possible, but not any simpler.
For Dave:
The wireless telegraph is not difficult to understand. The ordinary telegraph is like a very long cat. You pull the tail in New York, and it meows in Los Angeles. The wireless is the same, only without the cat.