I don't quite understand your question. Are you talking about null bytes that are being submitted in a GET or POST request to your CGI app? Is your app running into some particular problem involving null bytes?
I don't know for sure, but it wouldn't surprise me if there are situations where literal null bytes are expected as part of the data going from client to app or from app to client. Assuming such cases exist, filtering them out would seem to be a bad idea: the null bytes are probably part of a compressed or otherwise binary stream, and such a stream becomes unusable if certain bytes are stripped out of it.
Apart from such cases (if indeed there are any), I'd expect null bytes to be "encoded" in some way for transmission between server and client (e.g. as the three-character sequence "%00"). Again, I can imagine (but haven't seen) cases where this might be appropriate or necessary for some purpose, so again it's likely to be a mistake to filter them out.
Please explain what you're doing that involves null bytes in your CGI usage, and say more about the nature of the problem you're trying to solve.
I thought it would add some security to the program I'm making. As it is now, I just filter param values with very strict patterns as I use them. It works well, and there have been no problems. That seems to be the best practice when dealing with param values.
I plan to release the final version to the public, and the more I think about adding a global security filter to the params, the more I realize it could actually trick a developer into thinking they don't need to check the params for issues themselves.
So "no" on filtering null bytes is the answer I'm leaning to.
That is not the only thing I wanted to talk about.
I see in CGI there is a way to limit POST data only, but no way to limit GET data or cookies.
Is there a reason why those are not needed?
The most likely way you'd get recurring null bytes is if you are using UTF-16 as the page character encoding. In that case, filtering the null bytes is not the answer; instead, use an appropriate decoder.
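To make that concrete, here is a minimal sketch (the "AB" sample string is invented for illustration) showing why UTF-16 data looks like it is full of null bytes, and why decoding rather than filtering is the fix:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Encode qw(decode);

# "AB" encoded as UTF-16BE: every other byte is a null byte.
my $raw = "\x00A\x00B";

# Stripping the nulls would happen to "work" here, but decoding is the
# correct fix: the null bytes are part of the character encoding, not junk.
my $text = decode('UTF-16BE', $raw);
print "$text\n";    # prints "AB"
```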
If your problem stems from some other source you need to tell us more about the context of your problem.
Premature optimization is the root of all job security
I started playing around with a filter that can do it, but I'm not sure where it would be best to implement. Should it filter all parameter names and values, or only the values, as each parameter is fetched by name?
No. Don't run stupidly guessing filters over your input, that wastes CPU time at best or opens other security holes at worst. Validate your input. Don't accept data that does not pass validation. Enabling taint mode (see perlsec) forces you to validate your input. That does not make your code absolutely secure, but it prevents oversights.
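As a minimal illustration of the regex-capture idiom that perlsec describes (the helper name untaint_int is made up for this example, not part of any module):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# In a real CGI script you would run under taint mode (perl -T).  There,
# anything read from CGI parameters, %ENV, etc. is tainted and cannot
# reach open(), system() and friends until validated.  The only
# sanctioned way to untaint a value is to extract it via a regex
# capture, which forces you to state exactly what you accept.
sub untaint_int {
    my ($raw) = @_;
    my ($ok) = ($raw // '') =~ /^(0|-?[1-9][0-9]*)$/;
    return $ok;    # the captured, now-untainted value, or undef on failure
}
```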
Imagine the simple-and-stupid integer calculator example:
- You expect to have three parameters: left, operator, and right. There may be more parameters, but you either ignore them (friendly mode) or refuse to work (see below this list) if you don't get exactly the three expected parameters (paranoia mode). You do not import all parameters blindly into a namespace or into a hash or array.
- There must be only one parameter named left, i.e. $query->multi_param('left') must return a one-element array. You do not accept any other array size than one.
- There must be only one parameter named operator. See above.
- There must be only one parameter named right. See above.
- left and right must be numbers, i.e. they both must match /^(0|-?[1-9][0-9]*)$/. You do not accept any other value.
- operator must be an operator, i.e. it must match /^(plus|minus|times|divided by|to the power of)$/. You do not accept any other value.
Should any of these tests fail, abort all further processing and emit an error page, typically with an HTTP status code of 400 (Bad Request). Don't even try to do anything else before all parameters are validated. Especially do not try to open database connections or to acquire locks. Your attacker can usually send many more requests than you expect, so your machine may run out of resources quite fast. Your script must get rid of invalid or malicious requests as fast as possible, using as few resources as possible.
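Pulled together, the checks above might look like this. This is a sketch only: params_valid and bail_400 are made-up names for this example, and the wiring to CGI.pm's multi_param is shown in comments:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Validate the parameter lists for the calculator example.  Each value
# is the full array returned by $query->multi_param($name), so that a
# duplicated parameter is detected and rejected.
sub params_valid {
    my (%p) = @_;
    for my $name (qw(left operator right)) {
        # Exactly one value per expected parameter, no more, no less.
        return 0 unless $p{$name} && @{ $p{$name} } == 1;
    }
    # Whitelist the values: integers and a fixed set of operators.
    return 0 unless $p{left}[0]     =~ /^(0|-?[1-9][0-9]*)$/;
    return 0 unless $p{right}[0]    =~ /^(0|-?[1-9][0-9]*)$/;
    return 0 unless $p{operator}[0] =~ /^(plus|minus|times|divided by|to the power of)$/;
    return 1;
}

# In the CGI script itself (bail_400 would emit the 400 page and exit,
# before any database connection or lock is touched):
#
#   bail_400() unless params_valid(
#       left     => [ $query->multi_param('left') ],
#       operator => [ $query->multi_param('operator') ],
#       right    => [ $query->multi_param('right') ],
#   );
```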
Update:
For your real application, you should create a similar list (or table) of requirements and checks for all parameters. If you don't expect file uploads, you can disable the upload handling in CGI.pm; see $CGI::DISABLE_UPLOADS in the CGI documentation.
When you validate parameter values, check against a whitelist, i.e. specify the allowed values. Blacklisting is not reliable.
Validating file uploads is a little bit tricky:
CGI.pm can attempt to limit the size of POST requests, but by that point the web server may already have accepted an insanely large request. You may want to limit the request size in the web server, too.
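In CGI.pm that limit is the $CGI::POST_MAX variable, set before the query object is created. A configuration sketch (the 100 KiB limit is an arbitrary example, not a recommendation):

```perl
use strict;
use warnings;
use CGI;

$CGI::POST_MAX        = 100 * 1024;   # refuse POST bodies over 100 KiB
$CGI::DISABLE_UPLOADS = 1;            # and refuse file uploads entirely

my $q = CGI->new;

# When a request exceeds POST_MAX, param() returns nothing and
# $q->cgi_error holds an error like "413 Request entity too large".
if (my $error = $q->cgi_error) {
    print $q->header(-status => $error);
    exit;
}
```

Note that this only stops CGI.pm from parsing the body; the web server's own limit (e.g. LimitRequestBody in Apache) is still worth setting.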
Even well-behaved browsers allow uploading arbitrary junk, and things get worse when someone intentionally generates invalid uploads with malicious filenames, malicious MIME types, or malicious data in the file. So you have to validate all data you use from file uploads, including the file contents. Just looking at some magic numbers, as file, File::MMagic, and File::MimeInfo::Magic do, is not sufficient. See https://quadhead.de/storing-javascript-code-in-gif-images/ for a simple demo, and http://jklmnn.de/imagejs/ for a tool to generate images containing executable JavaScript code.
Alexander
--
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)