Collisions in these hash tables are expected, and the programming framework handles them reliably under normal use.
An attacker who knows the constants used in the hashing algorithm, however, can pre-compute a set of inputs that all hash to the same value. Inserting those colliding inputs degrades the hash table into a single long chain, so the number of string comparisons grows quadratically with the number of inputs, which can place a very heavy load on the web server.
One example from the talk showed that submitting roughly two megabytes of colliding values forces the web server to perform more than 40 billion string comparisons.
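The quadratic blow-up described above can be demonstrated with a toy hash table. This sketch (the class and its deliberately constant hash function are illustrative inventions, not code from any real framework) counts the string comparisons performed while inserting keys that all land in one bucket:

```python
# Toy separate-chaining hash table that counts string comparisons,
# to illustrate the quadratic cost when every key collides.
class CountingTable:
    def __init__(self, size=64):
        self.buckets = [[] for _ in range(size)]
        self.comparisons = 0

    def _hash(self, key):
        # Deliberately degenerate hash: every key collides.
        return 0

    def insert(self, key, value):
        bucket = self.buckets[self._hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(bucket):
            self.comparisons += 1          # each probe is a string comparison
            if k == key:
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

table = CountingTable()
for n in range(1000):
    table.insert(f"param{n}", n)

# Inserting n all-colliding keys costs 0 + 1 + ... + (n-1) = n*(n-1)/2
# comparisons: 499,500 for just 1,000 keys.
print(table.comparisons)
```

Scaling the same arithmetic up is what produces the enormous comparison counts quoted in the talk.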
During the talk they performed a denial-of-service attack against an Apache Tomcat server, a widely used Java servlet container.
They sent the server a set of pre-computed hash collisions, and the resulting processing kept the machine's CPU at 100% for the entire talk.
They explained that the proper fix belongs in the vulnerable languages themselves: randomize the key used when computing hashes, so that an attacker can no longer pre-compute collisions offline.
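The idea behind the fix can be sketched briefly. Here Python's keyed BLAKE2 mode stands in for a framework's internal string hash; the function names and table size are assumptions for illustration, not the mechanism any particular language shipped:

```python
import os
from hashlib import blake2b

# A secret key chosen fresh at process startup. Because the attacker
# cannot learn it, pre-computed collision sets become useless.
HASH_KEY = os.urandom(16)

def bucket_index(key: str, table_size: int = 64) -> int:
    # Keyed hash: the mapping from strings to buckets depends on
    # HASH_KEY, so it differs unpredictably from process to process.
    digest = blake2b(key.encode(), key=HASH_KEY, digest_size=8).digest()
    return int.from_bytes(digest, "little") % table_size
```

Within one process the mapping stays consistent (so the hash table still works), but an attacker probing from outside cannot predict which inputs share a bucket.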
Perl fixed this problem in version 5.8.1, released in September 2003. For some reason most other languages did not take the cue from Perl and remain vulnerable to these attacks.
Without a fix to the hashing functions in the languages themselves, three mitigation techniques are available to website operators:
Reduce the length of the parameters that can be posted.
Reduce the number of parameters accepted by the web application framework.
Limit the amount of CPU time that any given thread is allowed to run.
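The first two mitigations amount to bounding the request before it is ever parsed. A minimal sketch, assuming hypothetical limits and a simplified form parser (the names `MAX_BODY_BYTES`, `MAX_PARAMS`, and `parse_form` are illustrative, not from any specific framework):

```python
# Illustrative request-bounding checks: reject oversized bodies and
# excessive parameter counts before building the hash table at all.
MAX_BODY_BYTES = 64 * 1024   # mitigation 1: cap total posted length
MAX_PARAMS = 512             # mitigation 2: cap parameter count

def parse_form(body: bytes) -> dict:
    if len(body) > MAX_BODY_BYTES:
        raise ValueError("request body too large")
    pairs = body.split(b"&")
    if len(pairs) > MAX_PARAMS:
        raise ValueError("too many parameters")
    form = {}
    for pair in pairs:
        key, _, value = pair.partition(b"=")
        form[key.decode(errors="replace")] = value.decode(errors="replace")
    return form
```

Because the comparison cost grows quadratically, even a modest cap on parameter count shrinks the worst case dramatically. The third mitigation, a per-thread CPU-time limit, is usually a server or OS configuration setting rather than application code.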
Microsoft has released an advisory for ASP.NET customers with mitigation advice until a permanent fix can ship.
It may also be possible to configure web application firewalls and other network security devices to limit the impact of an attack; it is certainly worth consulting your security vendors to see if they can help.