The plan, which is being advocated by Communications Minister Stephen Conroy, a member of the Labor Party, seeks to protect children from online violence, pornography and other inappropriate material by requiring Internet Service Providers to supply homes and schools with filtered content.
Critics, however, fear that the censorship measures will lead Australia down the same path as Singapore and China, where "objectionable" websites are routinely blocked and stiff prison sentences are handed out for Internet crimes that often amount to little more than expressing viewpoints the government disagrees with.
"Labor makes no apologies to those that argue that any regulation of the Internet is like going down the Chinese road," Conroy said.
Australian Privacy Foundation chair Roger Clarke counters that the filtering plan would be ineffective and produce unintended consequences, and that the responsibility for protecting children on the Internet rightfully falls to parents and guardians.
"It's not the government's business to control information flows," Clarke said. "That's the kind of thing that goes on in oppressive countries, in authoritarian countries. That's not what the government is there to do."
According to the Internet Industry Association, ISPs are currently providing free filters, calling into question the need for legislation to mandate the practice.
"At the moment we don't know what the extent of it will be, what it will cost, and whether it will set a precedent for other changes," spokesman Peter Coroneos said.
One of the major criticisms is that the filtering process will slow Internet speeds.
"There are people who are going to make all sorts of statements about the impact on the speed," Conroy said. "But that is why we are engaged constructively with the sector, engaging in trials to find a way to implement this in the best possible way and to work with the sector."
The Australian plan, which relies on the "CleanFeed" technology developed by British telecommunications company BT, may not even be a reliable means of filtering content.
"At first sight, it's an effective and precise method of blocking unacceptable content," said Richard Clayton of the University of Cambridge's Computer Laboratory. "But there are a number of issues to address as soon as one assumes that content providers or consumers might make serious attempts to get around it."