I'm writing a webserver. Key features: it inspects all files with `file(1)` before serving them and refuses to serve anything identified as JavaScript or Flash. It refuses to serve any file larger than 1 MiB. It inspects HTML files before serving and removes all `<script>`, `<embed>`, and `<iframe>` tags. It refuses to serve HTML if `div` and/or `span` tags are nested more than 5 levels deep. Suggestions for additional "principles of non-violence" are welcome.
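For the HTML rules, here's a minimal sketch of the tag-stripping and nesting check using Python's stdlib `html.parser` (the class name and the limit constant are mine, purely for illustration; the real server may work completely differently):

```python
from html.parser import HTMLParser

BANNED = {"script", "embed", "iframe"}
MAX_NESTING = 5  # refuse HTML with div/span nested deeper than this

class Sanitizer(HTMLParser):
    """Drops banned tags (and their contents) and tracks div/span depth."""
    def __init__(self):
        super().__init__()
        self.out = []        # cleaned output fragments
        self.skip = 0        # >0 while inside a banned tag
        self.nesting = 0     # current div/span nesting depth
        self.too_deep = False

    def handle_starttag(self, tag, attrs):
        if tag in BANNED:
            self.skip += 1
            return
        if tag in ("div", "span"):
            self.nesting += 1
            if self.nesting > MAX_NESTING:
                self.too_deep = True
        if not self.skip:
            self.out.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if tag in BANNED:
            self.skip = max(0, self.skip - 1)
            return
        if tag in ("div", "span"):
            self.nesting -= 1
        if not self.skip:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip:
            self.out.append(data)

def sanitize(html):
    """Returns cleaned HTML, or None if div/span nesting is too deep."""
    p = Sanitizer()
    p.feed(html)
    p.close()
    return None if p.too_deep else "".join(p.out)
```

So `sanitize("<div><script>evil()</script><p>hi</p></div>")` comes back as `"<div><p>hi</p></div>"`, and anything with six or more nested divs gets refused outright.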


@reinboar I'm not sure yet whether I will make this serve static files only, or also support CGI and maybe FastCGI. If it's static-only, cookies aren't really a threat anyway (as far as I understand; happy to be corrected). If it supports dynamic content, then absolutely: the server will not pass cookies or `Referer` headers to the applications, and it will not permit the applications to send cookies or ETags (which can be used as a sneaky cookie substitute).
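For the dynamic case, the header filtering could be as simple as this hypothetical sketch (the function names and blocklists are mine, not anything that actually exists yet):

```python
# Headers stripped from requests before they reach an application.
REQUEST_BLOCKLIST = {"cookie", "referer"}
# Headers stripped from application responses before they reach the browser.
# ETag is blocked because cache validators can be abused as covert cookies.
RESPONSE_BLOCKLIST = {"set-cookie", "etag"}

def filter_request_headers(headers):
    """Drop cookies and referers on the way in."""
    return {k: v for k, v in headers.items()
            if k.lower() not in REQUEST_BLOCKLIST}

def filter_response_headers(headers):
    """Drop Set-Cookie and ETag on the way out."""
    return {k: v for k, v in headers.items()
            if k.lower() not in RESPONSE_BLOCKLIST}
```

The case-insensitive comparison matters: HTTP header names are case-insensitive, so a blocklist that only matches `Set-Cookie` exactly would be trivial to sneak past.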

I can't remember if there's a way for the server to read a browser's cookies or not. I know they get set initially by the server via headers, but past that I don't know.

@reinboar The browser automatically sends them along in the headers with any request to the server, based on the domain. The server has no way to request them directly. But a static site has no way to set them in the first place, or to interpret any that somehow got sent along, so until CGI arrives they're a non-threat.
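To make the two directions concrete, a small illustration with Python's stdlib `http.cookies` (the cookie name and value are made up):

```python
from http.cookies import SimpleCookie

# Direction 1: the server can only ever *set* a cookie by emitting a
# response header like this one.
c = SimpleCookie()
c["session"] = "abc123"
print(c.output())  # the Set-Cookie response header

# Direction 2: the server only *sees* the cookie again if the browser
# volunteers it in a Cookie request header on a later request.
incoming = SimpleCookie("session=abc123")
print(incoming["session"].value)
```

There's no third direction: the server can't reach into the browser's cookie jar and ask for anything, which is why a pure file-server that never emits `Set-Cookie` has nothing to worry about.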

@solderpunk Late reply, but wouldn't a static site be able to set cookies in response to a POST request from a form of some sort? Or are we ruling that out as static?

@reinboar My definition of "static website" is basically "a bunch of files served from the disk", so there's no way for a static website to process a POST request. That fundamentally requires CGI, FastCGI, or some newfangled work-alike.
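For what it's worth, refusing POST outright is a one-liner on top of Python's stdlib `http.server`; a hypothetical sketch (the handler name is mine):

```python
import threading
import urllib.error
import urllib.request
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

class StaticOnlyHandler(SimpleHTTPRequestHandler):
    """Serves plain files from the working directory; nothing dynamic."""
    def do_POST(self):
        # A bunch of files on disk has no way to act on form data,
        # so anything other than a plain fetch is refused outright.
        self.send_error(405, "Static server: POST not supported")

# Quick demo: start the server on an ephemeral port and POST to it.
server = ThreadingHTTPServer(("127.0.0.1", 0), StaticOnlyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", data=b"name=value")
    status = None
except urllib.error.HTTPError as err:
    status = err.code  # 405: the POST was refused
server.shutdown()
```

GETs still work as normal; the only thing the subclass changes is that a POST gets a 405 instead of being processed.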

