What things should a programmer implementing the technical details of a web site address before making the site public? If Jeff Atwood can forget about HttpOnly cookies, sitemaps, and cross-site request forgeries all in the same site, what important thing could I be forgetting as well?

I'm thinking about this from a web developer's perspective, such that someone else is creating the actual design and content for the site. So while usability and content may be more important than the platform, you the programmer have little say in that. What you do need to worry about is that your implementation of the platform is stable, performs well, is secure, and meets any other business goals (like not cost too much, take too long to build, and rank as well with Google as the content supports).

Think of this from the perspective of a developer who's done some work for intranet-type applications in a fairly trusted environment, and is about to have his first shot at putting out a potentially popular site for the entire big bad World Wide Web.

Also: I'm looking for something more specific than just a vague "web standards" response. I mean, HTML, JavaScript, and CSS over HTTP are pretty much a given, especially when I've already specified that you're a professional web developer. So, going beyond that, which standards? In what circumstances, and why? Provide a link to the standard's specification.

This question is community wiki, so please feel free to edit the answers to add links to good articles that will help explain or teach each particular point. To search in only the answers from this question, use the inquestion:this option.


The idea here is that most of us should already know most of what is on this list. But there just might be one or two items you haven't really looked into before, don't fully understand, or maybe have never even heard of.

Interface and User Experience



  • Implement caching if necessary, understand and use HTTP caching properly as well as HTML5 Manifest
  • Optimize images - don't use a 20 KB image for a repeating background
  • Learn how to gzip/deflate content (deflate saves a few bytes, but gzip has wider and more reliable browser support)
  • Combine/concatenate multiple stylesheets or multiple script files to reduce number of browser connections and improve gzip ability to compress duplications between files
  • Take a look at the Yahoo Exceptional Performance site: lots of great guidelines, including improving front-end performance, and their YSlow tool. Google Page Speed is another tool for performance profiling. Both require Firebug to be installed.
  • Use CSS Image Sprites for small related images like toolbars (see the "minimize http requests" point)
  • Busy web sites should consider splitting components across domains. Specifically...
  • Static content (i.e. images, CSS, JavaScript, and generally content that doesn't need access to cookies) should go on a separate domain that does not use cookies, because all cookies for a domain and its subdomains are sent with every request to that domain and its subdomains. One good option here is to use a Content Delivery Network (CDN).
  • Minimize the total number of HTTP requests required for a browser to render the page.
  • Utilize Google Closure Compiler for JavaScript and other minification tools
  • Make sure there's a favicon.ico file in the root of the site, i.e. /favicon.ico. Browsers will automatically request it, even if the icon isn't mentioned in the HTML at all. If you don't have a /favicon.ico, this will result in a lot of 404s, draining your server's bandwidth.
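To make the gzip point concrete, here is a minimal sketch in Python (the payload and header lines are illustrative): repetitive markup compresses dramatically, and a server advertises the compressed body with a Content-Encoding header.

```python
import gzip

# Illustrative HTML payload; repetitive markup compresses very well.
html = b"<ul>" + b"".join(
    b'<li class="item">Item %d</li>' % i for i in range(200)
) + b"</ul>"

compressed = gzip.compress(html)
print(len(html), "bytes ->", len(compressed), "bytes")

# A server would send the compressed body with:
#     Content-Encoding: gzip
# but only when the client's request advertised support:
#     Accept-Encoding: gzip, deflate
assert gzip.decompress(compressed) == html
```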

SEO (Search Engine Optimization)

  • Use "search engine friendly" URL's, i.e. use example.com/pages/45-article-title instead of example.com/index.php?page=45
  • Don't use links that say "click here". You're wasting an SEO opportunity and it makes things harder for people with screen readers.
  • Have an XML sitemap, preferably in the default location /sitemap.xml.
  • Use <link rel="canonical" ... /> when you have multiple URLs that point to the same content; this issue can also be addressed from Google Webmaster Tools.
  • Use Google Webmaster Tools and Yahoo Site Explorer
  • Install Google Analytics right at the start (or an open source analysis tool like Piwik)
  • Know how robots.txt and search engine spiders work
  • Redirect requests (using 301 Moved Permanently) asking for www.example.com to example.com (or the other way round) to prevent splitting the Google ranking between both sites
  • Know that there can be badly behaved spiders out there
  • If you have non-text content look into Google's sitemap extensions for video, etc. There is some good information about this in Tim Farley's answer.
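The www-vs-bare-domain redirect above can be sketched as a tiny WSGI middleware in Python. The canonical host, the https scheme, and the decision to prefer the bare domain are illustrative assumptions; query strings are omitted for brevity.

```python
# A sketch of the 301 canonical-host redirect as WSGI middleware.
CANONICAL_HOST = "example.com"  # assumption: the bare domain is canonical

def canonical_host_middleware(app):
    def wrapper(environ, start_response):
        host = environ.get("HTTP_HOST", "")
        if host == "www." + CANONICAL_HOST:
            location = "https://%s%s" % (
                CANONICAL_HOST, environ.get("PATH_INFO", "/"))
            # 301 tells search engines the move is permanent, so the
            # two hosts' rankings are consolidated.
            start_response("301 Moved Permanently",
                           [("Location", location)])
            return [b""]
        return app(environ, start_response)
    return wrapper
```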


Core technologies
  • Understand HTTP and things like GET, POST, sessions, cookies, and what it means to be "stateless".
  • Write your XHTML/HTML and CSS according to the W3C specifications and make sure they validate. The goal here is to avoid browser quirks modes and as a bonus make it much easier to work with non-standard browsers like screen readers and mobile devices.
  • Understand how JavaScript is processed in the browser.
  • Understand how JavaScript, style sheets, and other resources used by your page are loaded and consider their impact on perceived performance. It may be appropriate in some cases to move scripts to the bottom of your pages.
  • Understand how the JavaScript sandbox works, especially if you intend to use iframes.
  • Be aware that JavaScript can and will be disabled, and that Ajax is therefore an extension not a baseline. Even if most normal users leave it on now, remember that NoScript is becoming more popular, mobile devices may not work as expected, and Google won't run most of your JavaScript when indexing the site.
  • Learn the difference between 301 and 302 redirects (this is also an SEO issue).
  • Learn as much as you possibly can about your deployment platform
  • Consider using a Reset Style Sheet
  • Consider JavaScript frameworks (such as jQuery, MooTools, or Prototype), which will hide a lot of the browser differences when using JavaScript for DOM manipulation
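As a sketch of the "stateless" point above: HTTP itself remembers nothing between requests, so a "session" is just server-side state keyed by an opaque id that the browser replays in a Cookie header. All names and the in-memory store here are illustrative.

```python
import secrets

# Server-side session store; real apps use a database or cache.
SESSIONS = {}

def create_session(data):
    sid = secrets.token_hex(16)
    SESSIONS[sid] = data
    # Sent to the browser as: Set-Cookie: sid=<sid>; HttpOnly
    return sid

def load_session(cookie_header):
    # Parse "sid=..." out of a Cookie header and look up the state.
    for part in cookie_header.split(";"):
        name, _, value = part.strip().partition("=")
        if name == "sid":
            return SESSIONS.get(value)
    return None

sid = create_session({"user": "alice"})
assert load_session("sid=%s" % sid) == {"user": "alice"}
```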

Bug fixing

  • Understand you'll spend 20% of the time coding and 80% of it maintaining, so code accordingly
  • Set up a good error reporting solution
  • Have some system for people to contact you with suggestions and criticism.
  • Document how the application works for future support staff and people performing maintenance
  • Make frequent backups! (And make sure those backups are functional) Ed Lucas's answer has some advice. Have a Restore strategy, not just a Backup strategy.
  • Use a version control system to store your files, such as Subversion or Git
  • Don't forget to do your Unit Testing. Frameworks like Selenium can help.

Lots of stuff has been omitted, not necessarily because it isn't useful, but because it's either too detailed, out of scope, or goes a bit too far for someone looking for an overview of the things they should know. If you're one of those people, you can read the rest of the answers for more detailed information about the things mentioned in this list. If I get the time, I'll add links to the various answers that go into detail about these things. Please feel free to edit this as well; I probably missed some stuff or made some mistakes.


Rule number one of security:

Never trust user input.

  • Never put email addresses in plain text because they will get spammed to death
  • Be ultra paranoid about security.
  • Get it looking correct in Firefox first, then Internet Explorer.
  • Avoid links that say "click here". It ruins a perfectly good SEO opportunity.
  • Understand that you'll spend 20% of the time developing, 80% maintaining, so code it accordingly.
  • Web standards: it's cheaper to aim for standards than to test against every browser available (and on a public website you will see a lot of different browser/version/OS combinations, easily 30+)
  • SEO-friendly URLs: changing URLs later in the game is quite painful for the developers and the site will most probably take a PageRank hit.
  • Understand HTTP. If you have only worked with ASP.NET webforms, then you probably don't really understand HTTP. I know people that have worked with webforms for years and they don't know the difference between a GET and a POST, let alone cookies or how session works.
  • HTTP Caching: Understand what to cache and what NOT to cache.
  • Optimize image weights. It's not cool to have a 20 KB image for a repeating background...
  • Read and understand Yahoo's best practices (http://developer.yahoo.com/performance/rules.html). Not every rule applies to every website.
  • Use YSlow for guidance, but understand its limitations.
  • Understand how JavaScript is processed on the browser. If you put tons of external scripts at the beginning of your page, it's going to take forever to load.
  • Consider cell phone usability: some users will access your site using their native cell phone browser (I'm not talking about iPhones or Opera Mini). If your site is pure Ajax, they will probably be out of luck.
  • Learn the difference between 301 and 302 redirects: it's not the same for search engines.
  • Set up Google Analytics (or any other analytics package) right from the start.

Not specific to public websites, but useful nevertheless:

  • Server caching: identify and exploit any caching opportunities, it makes a big performance difference. It's often overlooked on non-public websites.
  • Set up a good error reporting solution, with as many details as possible. You will get a lot of errors when you launch, no matter how much you tested, so you better get all the details you can.
  • Set up an Operation Database (see for example http://ayende.com/Blog/archive/2008/05/13/DevTeach-Home-Grown-Production-System-Monitoring-and-Reports.aspx) so you can quickly identify bottlenecks.
  • Set up a good deployment strategy. You will probably deploy more often than non-public sites (we deploy daily).
  • Realize that web applications are inherently multi-threaded, you will have lots of visitors (typically much more than in non-public websites), and threads are not unlimited.


  • Filter and validate incoming user input ('amount' does not need to accept alphabetical characters) and escape outgoing user input (a ' in user input is NOT the same as an SQL ').
    Never trust any data given by the user.
  • And the above will help with protecting against SQL injection.
  • Understand SSL
  • Keep your systems up to date with the latest patches.
  • Protect yourself from cross site scripting
  • How to resist session hijacking
  • Find out about HTTPOnly cookies
  • How to handle authentication/permissions
  • Understand PKI (public keys)
  • Keep up to date! This is the most important thing, make sure to follow all the latest information about possible security issues and vulnerabilities that affect your platform.
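The "never trust any data given by the user" rule applied to SQL, sketched with Python's built-in sqlite3 (the table and data are illustrative): a parameterized query binds the input as data, so an embedded quote never becomes SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, amount INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 10)")

malicious = "alice' OR '1'='1"  # classic injection attempt

# Safe: the driver binds the value; the quote is just a character.
rows = conn.execute(
    "SELECT amount FROM users WHERE name = ?", (malicious,)
).fetchall()
assert rows == []  # no user is literally named that, so nothing matches

rows = conn.execute(
    "SELECT amount FROM users WHERE name = ?", ("alice",)
).fetchall()
assert rows == [(10,)]
```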


  • Create SEO friendly URLs - example.com/articles/rampaging-bull-tramples-unicorn NOT example.com?article=45
  • Use an XML sitemap so that site engines can crawl your site more intelligently
  • Set up Google Analytics (or another analytics package) from the start
  • Learn the difference between 301 and 302 redirects: it's not the same for search engines.
  • Set up a robots.txt file



  • Documentation!
  • Code from the beginning with maintainability in mind
  • Have a good deployment strategy - don't save it to the very end to figure this out.
  • URLs designed with REST in mind could save you a headache in the future.
  • Use patterns like MVC to separate your application flow from your database logic.
  • Be aware of the many frameworks out there that will speed up your development
  • Use staging and a version control system to deploy updates so that your users won't be affected
  • Set up an error logging system. No matter how well coded your website will have errors when it is released. Don't wait for the user to let you know; be proactive in identifying errors and bugs
  • Have a bug tracker
  • Know your environment. Your OS, language, database. When you need to debug it will be important to understand how these things work at a basic level in the least.

User experience

  • Be aware of accessibility. This is a legal requirement for some programmers in some jurisdictions. Even if it's not, you should bear it in mind.
  • Never put email addresses in plain text, or they will be spammed to death.
  • Have some method for users to submit their comments and suggestions
  • Catch errors and don't display them to the user; display something they can understand instead
  • Remember that cell phones and other mobile devices with browsers are becoming more common. Sometimes they have very poor JavaScript support. Will your site look okay on one of these?

Core Web technologies

  • Understand HTTP, and things like GET, POST, cookies and sessions.
  • How to work with absolute and relative paths
  • Realize that web applications are inherently multi-threaded, you will have lots of visitors (typically much more than in non-public websites), and threads are not unlimited.

I, personally, avoid using extensions like .php in my URLs. For example:

http://example.com/about

http://example.com/about.php
Not only does the first URL look cleaner, but if I decided to switch languages, it would be less of an issue.

So how does one implement this? Here is the .htaccess code I found works best:

# If requested URL-path plus ".php" exists as a file
RewriteCond %{DOCUMENT_ROOT}/$1.php -f
# Rewrite to append ".php" to extensionless URL-path
RewriteRule ^(([^/]+/)*[^.]+)$ /$1.php [L]

Source: http://www.webmasterworld.com/apache/3609508.htm


I'll add one:

  • how to do caching

Here are a couple of thoughts.

First, staging. For most simple sites developers overlook the idea of having one or more test or staging environments available to smoothly implement changes to architecture, code or sweeping content. Once the site is live, you must have a way to make changes in a controlled way so the production users aren't negatively affected. This is most effectively implemented in conjunction with the use of a version control system (CVS, Subversion, etc.) and an automated build mechanism (Ant, NAnt, etc.).

Second, backups! This is especially relevant if you have a database back-end serving content or transaction information. Never rely on the hosting provider's nightly tape backups to save you from catastrophe. Make triple-sure you have an appropriate backup and restore strategy mapped out just in case a critical production element gets destroyed (database table, configuration file, whatever).

  1. Web standards
  2. Awareness of browsers
  3. Awareness of accessibility
  4. Awareness of usability

In addition to caching


It might be a bit outside of the scope, but I'd say that knowing how robots.txt and search engine spiders work is a plus.


Aside from basic competence in the base language and the key technologies which might be assumed (although shouldn't be taken for granted):

  • Platform - no good attempting to develop an ASP.NET application for a server that doesn't support .NET, no good attempting to provide a SQL Server database to be hosted on a MySQL Server... etc.
  • Deadline/Budget - Does it need to be done by next week and therefore potentially has lots of quick hacks and workarounds vs. coding to strict standards and doing everything the right way.
  • Content - who is providing it? Has it been vetted for quality and approved for publication? Have all applicable copyrights been checked? Etc.
  • Team/Stakeholders - Who needs to be kept in the loop for development, who will the developer be working with, who do they need to keep happy? Etc. Will there be a designer or is the developer the designer too? Don't hire a top notch developer and assume their design skills are all that - most of them are not. I get by and can make something that looks reasonably professional, but I wouldn't consider myself a designer even by a long stretch of the imagination.
  • Target Audience - savvy, not-savvy, intranet, Internet. Make no assumptions here, there's a great quote that goes "Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning."
  • Hardware Base - how much performance has/have the host machine(s) got? Do we have to be concerned about limited memory/diskspace/resources? Obviously if it's only got a small amount of memory, then we need to make sure that minimal memory resources are used in the design of our application. Likewise for diskspace, etc.
  • Platform - overall architectural/network topology
  • Maintenance - who will be maintaining this product? If the maintenance crew all have a VB background and haven't the first clue about PHP or C#, don't write it in those languages!! If the maintenance crew is you, then code in whatever you're most comfortable in.

This is all before you even get to a web environment really. Once you get into a web environment you would really expect them to understand (in no particular order):

  • Stateless interfaces
  • Web protocols (HTTP/HTTPS/FTP) etc
  • JavaScript and/or other relevant client-side coding techniques
  • Various Persistence techniques - Cookies, Sessions, ViewState, ObjectState (and/or any others that relate to the APIs being used)
  • At least a basic understanding of HTTP handlers and how they do their job
  • Page Lifecycle
  • Security in web environments - XSS, SQL Injection, Session hijacking, etc., etc.

After that:

  • Competence in the language used to develop the site
  • Knowledge of standards and best practices and an ability to apply them effectively
  • A good understanding of Cross browser techniques and hacks
  • CSS techniques and standards (if the developer is expected to design too)
  • Understanding of various browsers and their idiosyncrasies and workarounds

And then - if your site is data driven

  • An understanding of the database technologies to be used
  • RDBMS design and performance tuning if you're asking them to design the underlying database. If you've got a DBA for that, then this is not such a major concern.

Well, everyone else has already mentioned most things I thought of - but one thing I always forget is a favicon. Sounds stupid, I know, but I think it's one of those little things that helps to emphasise your brand, and I never seem to remember it. Please check Scott Hanselman's post about how to use it carefully.

I agree with some of the rest too - I think it's important to know as much as possible about your chosen language, so that you can code with best practices and maintainability in mind. I've come across functions and patterns that I wish I'd known about when I did my first few crappy, amateur projects, as it would have saved me from writing some WTF-ey workarounds!


When to say "no" to the designer or client, and how to do so gracefully and diplomatically.


The cruel, hard facts:

Users spend about as much time on your website as an interviewer spends reading your resume when it arrives in a pile of thousands of others:

  • Users spend very little time on your website: Read, seconds.
  • Users are lazy and they would rather be somewhere else
  • If the user can't find what they are looking for within seconds, they leave
  • If the user cannot identify what the website is all about, they leave
  • If the website does not 'just work', they leave
  • If the website annoys the user or does not appeal to them aesthetically, they leave

Everything about websites and website design revolves around these facts.

  • Clear Navigation
  • Conciseness
  • Branding strategies
  • Colors, schemes, aesthetics, text placement, text formatting
  • Helpful, not hindering, Ajax/JavaScript
  • Not reinventing the wheel when it comes to website use, navigation, etc.

This is just an outline on why it is so important to adhere to standards and read those website design books.


You also have to:

  • Keep your system up to date with the latest patches.
  • Keep yourself up to date with knowledge of new vulnerabilities affecting your platform, and attack techniques in general.

I follow several security related blogs and podcasts.

In addition, I get email alerts from SANS https://portal.sans.org/. (you need to register, but it's a great source).

(I'm always interested in learning about other good sources, too).


If you have any influence on design, please read, "Don't Make Me Think" by Steve Krug. It is an easy read, and will almost certainly make you think...


How to work with absolute and relative paths.


I would think that knowing all you can about your deployment environment would rank up there.

IIS, MSSQL or Apache, MySQL, etc? ASP.NET, PHP, etc.?

Perhaps this is a no-brainer, but surely someone out there has written code that relies on [insert dependency] only to find out their client's server was missing [aforementioned dependency].

  • Valid (X)HTML - with the appropriate tags.
  • No broken links (See above about relative links)

Good thread. Here are some areas I think no one's mentioned:

Accessibility (a11y) and WAI-ARIA tags, and so forth. And since it's 2010, why not start adding some HTML5 into the mix as well.

Check out Selenium for JUnit-style client-side testing.

And lastly, Content Delivery Networks: don't host your static files yourself if you can avoid it, e.g. use Akamai or Google's hosted copy of jQuery.

  • Consider URLs: a URL design with REST in mind could make exposing APIs easier in the future. It is definitely much easier to get your URLs right the first time than to change them in the future and deal with the SEO consequences.
  • Read Josh Porter's book Designing for the Social Web.
  • Have some way to accept criticism and suggestions.
  • Know what progressive enhancement and graceful degradation are; JavaScript is NOT a requirement to operate the web and should be treated as such.

How to build a scalable design in the off chance that the site becomes really popular.


Know how to hinder Denial of Service (DoS) attacks on user login forms by keeping track of the number of failed logins over a given period of time. In the event you hit a certain threshold above the running average, increase the delay imposed on subsequent login attempts by a particular amount (say, 5 seconds).

Someone feel free to modify for clarity :)
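The failed-login throttling idea above can be sketched in plain Python. The window, threshold, and penalty values are illustrative assumptions, and a real deployment would track this in shared storage rather than process memory.

```python
import time
from collections import defaultdict, deque

WINDOW = 60.0   # seconds to remember failures (illustrative)
THRESHOLD = 5   # failures tolerated inside the window
PENALTY = 5.0   # extra seconds added per excess failure

failures = defaultdict(deque)

def record_failure(user, now=None):
    now = time.monotonic() if now is None else now
    q = failures[user]
    q.append(now)
    # Drop failures that have aged out of the sliding window.
    while q and now - q[0] > WINDOW:
        q.popleft()

def login_delay(user, now=None):
    """Seconds the caller should wait before processing this attempt."""
    now = time.monotonic() if now is None else now
    recent = sum(1 for t in failures[user] if now - t <= WINDOW)
    excess = max(0, recent - THRESHOLD)
    return excess * PENALTY
```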



  • Consider using an application firewall such as UrlScan, which works by blocking specific HTTP requests; this helps prevent potentially harmful requests from being processed by web applications on the server.
  • Disable Directory listing.
  • Consider using a lower privilege identity.
  • Don't use blacklists; use whitelists instead. Teach your application what to accept, not what to avoid.
  • If using ASP.NET, encrypt your connection strings using aspnet_regiis. This tool is easy to use and requires only simple steps to both encrypt and decrypt connection strings.
  • Pages with sensitive data should not be cached: page content is easily accessed using the browser's history.
  • Validate user input in the application; regular expressions help here (and be sure they work the way they are meant to).
  • Never rely on client-side validation alone (e.g. Ajax or JavaScript validation libraries). JavaScript can, and will, be turned off, and with it your protections.
  • Do NOT use GET for anything that changes server state or contains sensitive information. GET requests are logged in the web server's access logs and shown in the browser history.
  • DO use POST for every action that changes server state, and reject all non-POST methods for those actions. POST prevents unintentional actions, most search engines won't crawl POST forms, and it also helps prevent duplicate submissions.
  • If using cookies, mark them as HttpOnly (in ASP.NET, via System.Net.Cookie; set the httpOnlyCookies attribute on the authentication cookie). Internet Explorer 6 Service Pack 1 and later support this attribute, which prevents client-side script from accessing the cookie via the document.cookie property.
  • Robots.txt files are the first place hackers look at. Use access controls to protect them.
  • When constructing SQL queries, use type-safe SQL parameters. Parameterized stored procedures are a solid approach from both technical and security points of view.
  • If using SQL Server, use a least-privilege user. Create a SQL Server login for the account, map the login to a database user in the required database, and place the database user in a database role. Grant the database role limited permissions to only those stored procedures or tables your application really needs. By using a database role, you avoid granting permissions directly to the database user. This isolates you from potential damage to the database.
  • Use SSL where possible; this will encrypt and protect your data while on the wire. Using SSL doesn't necessarily mean you are secure: it simply means your data is encrypted in transit. If using SSL, restrict authentication tickets to HTTPS connections only.
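The list above is ASP.NET-flavoured, but the HttpOnly/Secure idea is framework-neutral. A sketch using Python's stdlib, just to show the resulting Set-Cookie attributes (cookie name and value are illustrative):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["auth"] = "opaque-session-token"  # illustrative value
cookie["auth"]["httponly"] = True  # hides it from document.cookie
cookie["auth"]["secure"] = True    # only ever sent over HTTPS
cookie["auth"]["path"] = "/"

header = cookie["auth"].OutputString()
print("Set-Cookie:", header)
```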


  • Minimize HTTP Requests
  • Add an Expires or a Cache-Control Header
  • Gzip Components
  • Put Stylesheets at the Top
  • Put Scripts at the Bottom
  • Minify JavaScript and CSS
  • Optimize Images

Found a new one today:

  • Reset Style sheets

These are style sheets you include as a baseline when starting a project, to give you more consistent behavior across different browsers. See this question:


On a public site, make sure you are using an XML sitemap to help search engine crawlers crawl your content more intelligently.

If you have non-HTML content on your site, you should also look into Google's extensions of the sitemap protocol to make sure you are using whatever is appropriate. They have specific extensions for News, Video, Code, Mobile-specific content and Geospatial content.

One thing I learned that was not obvious in the Google help is that each of these content-specific sitemaps should be a separate file, joined together at the root with a sitemap index file. For some reason Google doesn't like you to mix content types in one sitemap. Also, when you use Google Webmaster Tools to tell Google about your sitemaps, tell it about each of the special sitemaps separately and use the drop-down to specify the type. You would think the crawler could use the XML to auto-detect this stuff, but apparently not.


Ensure that whatever framework/server-side scripting/web server/other you're using doesn't expose error messages directly to the user.

Check that whatever was put in place during development to surface errors (debug flags, verbose traces) is switched off or reversed before launch. Obviously the preference is to have this stuff properly configured in the first place, but it will still occur time and time again.

That's mainly written from a security standpoint, but a closely related usability issue is ensuring that, should errors occur, the user gets something that makes sense to them and that tries, as best as possible, to get them back to what they were doing.


Good knowledge of HTTP, including caching and expiry headers


How to avoid Cross site request forgeries (XSRF) (this is not cross site scripting (XSS))

Now I'll probably be modded down for overuse of parentheses.


I don't have enough karma to edit, so here's my contribution. It looks like everyone here is from the US :)

i18n and l10n

  • Use the correct character encoding for your web page (charset encoding)
  • Read the Accept-Language header to choose the page's rendering language. I have seen too many web sites that localise depending on my IP address and ignore the "Accept-Language" header! Painful, as I then have no idea how to view the site in English anymore.
  • Localise for the user's timezone. It's difficult to get this right, but users will appreciate it.
  • Format numbers, currency, dates, times, addresses, and names per the user's region. Default to IP-address-based localisation only if you don't have the user's region in their profile.
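The Accept-Language point above, sketched in plain Python (the supported-locale list is an illustrative assumption): parse the quality-weighted header and fall back to a default, instead of guessing from the IP address.

```python
SUPPORTED = ["en", "fr", "de"]  # illustrative

def pick_language(accept_language, default="en"):
    choices = []
    for part in accept_language.split(","):
        part = part.strip()
        if not part:
            continue
        lang, _, qpart = part.partition(";")
        q = 1.0  # per RFC, a missing q-value means q=1
        if qpart.strip().startswith("q="):
            try:
                q = float(qpart.strip()[2:])
            except ValueError:
                q = 0.0
        # "fr-CA" falls back to its primary subtag "fr".
        primary = lang.strip().split("-")[0].lower()
        if primary in SUPPORTED:
            choices.append((q, primary))
    return max(choices)[1] if choices else default

assert pick_language("fr-CA,fr;q=0.9,en;q=0.8") == "fr"
assert pick_language("da, es;q=0.7") == "en"  # nothing supported
```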

Make sure (unlike me) you don't develop your site using FF3 and IE8 and then, at the end, check IE7 and see that it looks a mess and needs days of tweaking.

Always check that the site renders OK in a number of different browsers during development; don't leave it till the end.


Most of the essentials have been covered by the top 10 answers, but here are a few of the ones I missed up there:

  • For browser compatibility testing, use browsershots.org (free) or better yet, litmus (cheap)

  • For stress testing, use the command line tool ab - ApacheBench (on Linux/Mac OS X). It will let you find the 'heavier' pages, so you can do your performance tweaking where it will matter the most (that is, caching!). "A slow page is a DoS attack waiting to happen."

  • If you, like most, will be using a web host rather than hosting your own web server, spend a couple of weeks (yes, weeks!) on the WebHostingTalk.com forum to get a feel for which hosting providers are currently the best in the lands. That forum is THE one and only gathering place for serious web hosting nerds, and these cats have the dirt on everyone. If you are serious about your web sites, you need to background check your hosting providers on WebHostingTalk.

  • Use a remote distributed system for monitoring your uptime (e.g. to determine whether it's time to move to a different hosting provider) - host-tracker.com comes to mind, but there are many others

  • Do not write your own CAPTCHA. I repeat: Do NOT write your own CAPTCHA!


Begin by designing your page as if HTML was your only tool and JavaScript and CSS didn't exist, and make sure it validates. (This is not an excuse to use <font> tags, I'm talking about making good semantic code here!)

Then, add CSS (from an external file), and gently style your work, adding as little extra HTML as possible.

Finally, make your JavaScript (I'd use jQuery) enhance the user experience - again adding as little extra markup as possible.


Cross-browser support, particularly with respect to CSS.


You should consult the OWASP web site and understand the vulnerabilities listed there. Keep in mind OWASP does not talk about issues like scalability, session state management issues, and browser compatibility. Those areas will need to be understood as well. But I would argue that they certainly are less important than security.


If you implement a "I forgot my password" feature, don't email their password back in plaintext. Instead, email them a time-expiring link which will take them to a page that allows them to select a new password.
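A sketch of that time-expiring reset link (the storage, TTL, and URL shape are illustrative assumptions): store only a hash of a random token, email the raw token, and refuse it once expired or already used.

```python
import hashlib
import secrets
import time

TOKEN_TTL = 3600  # one hour, illustrative
pending_resets = {}  # token_hash -> (user, expires_at)

def issue_reset_token(user, now=None):
    now = time.time() if now is None else now
    token = secrets.token_urlsafe(32)
    # Store only a hash, so a leaked table doesn't leak usable links.
    token_hash = hashlib.sha256(token.encode()).hexdigest()
    pending_resets[token_hash] = (user, now + TOKEN_TTL)
    # Emailed as e.g. https://example.com/reset?token=<token>
    return token

def redeem_reset_token(token, now=None):
    now = time.time() if now is None else now
    token_hash = hashlib.sha256(token.encode()).hexdigest()
    entry = pending_resets.pop(token_hash, None)  # single use
    if entry is None:
        return None
    user, expires_at = entry
    return user if now <= expires_at else None
```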



This is an interesting video (more about web applications, but always a good thing).


Web standards:

  • HTML
  • CSS
  • XML
  • JavaScript

HTTP protocols.
UI Design
Web Security
Web Caching
Some web server knowledge (Apache httpd, IIS, lighttpd)


Setting aside all the technical aspects, skills, and security, I would make sure that the site is easy to use and really does what the user expects. Human-computer interaction is important. Layout and flow are important. Otherwise no one will use it, other than scammers, spammers, and robots.




How to handle the Slashdot effect.


Having a backup strategy is really important (as has already been mentioned), but checking the backups is equally important. There is no point in having hundreds of backups if they are all corrupt. Your restoration strategy should be known and tested, depending on the needs of the business.


HTTP Compression is often overlooked and can drastically speed up a website.


A web developer should know:

  • Jakob Nielsen's Alertbox
  • That less is more
  • How to decouple presentation (HTML, CSS) from business logic (JavaScript, backend)

What should a developer know before building a public web site?

What about the data?

  1. Data normalization
  2. Design the query structure of your data carefully
  3. Optimize it, and understand where to cache and where not to
  1. Cross Browser Compatibility

  2. SEO

  3. Horizontal /Vertical Scaling

  4. Advantages/ Disadvantages of Caching


Duplicate slashes in a path are normally harmless, but <a href="//index.html"> does not mean what you think it means.
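The reason: `//index.html` is a protocol-relative (scheme-relative) URL, so the browser keeps the current scheme but treats `index.html` as the *host*, not a path. Python's `urljoin` resolves references the same way browsers do:

```python
from urllib.parse import urljoin

# Scheme-relative reference: everything after "//" is authority (host), not path.
print(urljoin("https://example.com/page", "//index.html"))  # https://index.html

# Root-relative reference: what the author almost certainly intended.
print(urljoin("https://example.com/page", "/index.html"))   # https://example.com/index.html
```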


Read about the Principle of Least Astonishment; the articles "Principle of least astonishment" and "User-Friendly Programming" are good starting points.


One very important thing for UI-heavy sites is taking care of screen resolution. It can totally make or break the UI experience of your site.


Have a basic understanding of Web Analytics so they can understand how the users are interacting with their site.

  • Web standards
  • CSS
  • Interface Design

If it's unusable, you have no chance!


From a systems perspective, document how the application works and the subsystems involved, and add instrumentation to the application for the systems in which it will run (e.g. event logs or Performance Monitor in Windows).

The application has to be run by some support personnel and they need tools to track possible problems that may appear.


Consider your design from your potential users' perspectives. How will they use the site? What will benefit them most? What will annoy, frustrate, or keep them from using it? If you're trying to decide on a design element that will benefit you, but not the user, scrap it.


I am new to web development, and the problems I faced were:

  1. Detailed knowledge of JavaScript and Ajax.
  2. Security, especially XSS, CSRF, etc.
  3. Some knowledge of CSS, even if there are dedicated designers for it.
  4. Adherence to W3C standards and others.
  5. Deployment issues and how to solve them.
  6. Browsers and how they work; the same-origin policy and why it is important.

Regarding credit cards and debit cards, at least within the United States, be aware of PCI compliance and the various rules and responsibilities that it covers. Accepting credit cards for a small e-commerce application can open a very nasty can of worms if the proper security measures are not in place. It goes way beyond having SSL enabled on the web site. Search for PCI-DSS on your favorite search engine and make sure you, and your clients, understand the regulations that they will need to follow. Other locales have similar rules under different names, but all of the major payment card players are getting serious about securing cardholder data.


Especially for SEO, but for some other reasons as well: remove session IDs from (public) URLs. They might have been added by the web framework for cookie-less browsers, but they are usually not required for public browsing.
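As an illustration (assuming Java-style `;jsessionid=` path parameters; the helper name is hypothetical), stripping the session ID before emitting a public URL can be a one-line rewrite:

```python
import re

# Matches ";jsessionid=<value>" up to the query string or fragment.
JSESSIONID_RE = re.compile(r";jsessionid=[^?#]*", re.IGNORECASE)

def strip_session_id(url: str) -> str:
    """Remove a jsessionid path parameter from a URL, leaving query and fragment intact."""
    return JSESSIONID_RE.sub("", url)
```

Other frameworks put session IDs in the query string instead, which needs a different pattern.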


This may have already been mentioned, but know how the client plans to update the site. If the client has someone who "knows HTML", then prepare for problems. It's best to have a good CMS in place if the client wishes to update the website themselves, and NEVER let them have access to all of your code.


Develop for Gecko and Webkit browsers first, then use conditional comments to address IE issues that cannot be fixed by tweaking CSS (e.g. for more specificity, rules that trigger IE's 'hasLayout', etc.)


Nothing. If you're building your first site, just build it. Get dirty, make mistakes, and learn. Because after you've built hundreds of sites and lots of advanced tricks are second nature to you, after you've done it all and seen it all, the one thing you'll always need to remember is the one thing you knew when you started: you don't know everything. Especially if you're worried about security. Even if you cover all the bases, someone will come up with something new. It's the downside to being one of the Good Guys.


You need very little knowledge to put a site out to the public. Don't forget that there are billions of sites out there, and you don't want to spend months of your valuable time building something that nobody wants.

All the skills you need are basic HTML, CSS and JavaScript to quickly throw up a prototype and put it out in front of the big bad web. Think about it this way - if you build out something really awesome in several months, let's say, and you put it out on web, and nobody clicks on that link to Get Started, then something has gone terribly wrong.

Either you were working on the wrong problem, or a problem that nobody had, or they didn't know they had a problem. You could simply test your early hypothesis by putting up a nice fancy mockup landing page with a link saying "Get Started", and when users click that, you take them to a thank you page asking them for their email/contact information to inform them for when you do actually go live.

I have recently been introduced to this idea of a Minimum Viable Product (MVP) which is very radical in terms of what it is. It's not a minimum viable product in the sense that most developers would think of it as. Here's a nice interview with Eric Ries that talks about the idea in detail - http://venturehacks.com/articles/minimum-viable-product.

Kent Beck, the creator of the Extreme Programming methodology, had an interesting story to share at the Startup Lessons Learned conference today in San Francisco. He had an idea of introducing a payment gateway to charge users for unlocking higher levels of a game he was building. They estimated it was going to take a little while to implement the whole thing, so he decided to just put up a button saying "Buy the Next Level" on the game page. When users clicked that button, they were simply let into the next level without being charged. It didn't hurt them at all, as they didn't have a million+ user base, and they collected valuable information about how many users were actually willing to buy the next level.

So I would recommend you don't wait until you build a nicely polished and finished product before reaching out to your users. And to get started with that, you don't need a whole lot of knowledge. Basic HTML/CSS/JavaScript skills are more than sufficient to get started.


I agree with "The Professor": there's no point in having a beautifully built site that validates correctly and is accessible to all if the content is rubbish. In addition to his comment, though, I'd add spell checking and proofreading. I find that the majority of tweaks that have to be made after a site has gone live come down to spelling/grammatical issues.


If you are going to accept user input, learn input validation. This is the biggest thing programmers make mistakes on: they accept user input in random locations, which allows script kiddies to come along and remotely include a file that then gives them full control over your machine.

"Be lenient in what you accept, but strict in what you output"

However, don't trust any user generated input in any way shape or form. Don't trust it!
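Two habits cover most cases: validate input against an allowlist on the way in, and escape output on the way out. A hedged Python sketch (the helper names and the username rule are illustrative, not from any particular framework):

```python
import html
import re

# Allowlist: accept only what you expect, reject everything else.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,20}$")

def validate_username(value: str) -> str:
    """Raise on anything outside the allowlist instead of trying to 'clean' it."""
    if not USERNAME_RE.fullmatch(value):
        raise ValueError("invalid username")
    return value

def render_comment(comment: str) -> str:
    """Never echo raw user input into HTML; escape on output."""
    return f"<p>{html.escape(comment)}</p>"
```

SQL needs the same discipline via parameterized queries; escaping for HTML does nothing against SQL injection.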


Understand how to monitor a site for intrusion and make it easy for the person who manages the site to recover to a known-good state. Even if you aren't going to be managing the site you should educate the site-owner in this regard before handing it over.

Even if your code is bulletproof, the server that the site is hosted on can be compromised (especially in a shared-server environment), so it seems like it's not so much a question of whether your site will be hacked, as when it will happen and how much pain will be involved in cleaning it up.

So you'll want to design with this in mind; e.g., craft your URL scheme such that it is easy to spot malicious requests in the access logs; think carefully before storing page templates in a database; and so forth.


Design and develop the site with localization for other languages in mind.


You need to know what is easy to use for the general public, not for an IT professional or software developer.


Take a look at a good web usability book, e.g. Don't Make Me Think: A Common-Sense Approach to Web Usability, by Steve Krug.


If for some reason you don't trust Google or you want to have more control over the collected data, try Piwik as analysis tool. It is open source and extensible via plugins.


The most important thing for a web site developer to know is that there really is no such thing as a standard. The standards exist, but are often ignored or are incorrectly implemented.

The only way to know if your pages are going to operate correctly on all web browsers is to try them on every browser you can find: IE, Firefox, Opera, Safari, and Chrome for a start.

So, yes, of course, use standard practices. But then test and remove those features which do not work across all browsers.

  • One of the key things is to understand how you are going to debug your system. This means understanding the 'big picture'. So know your environment (OS, database, framework, networking, et al.) and at least know where to 'look' if you have ten users each calling with their own issue, even if you did not write all that server-side code.

  • Often, good user interface design (error logging with the right amount of detail, log levels, hooks to display some details on demand) will go a long way.


Know how to resist session hijacking. HttpOnly is only one aspect of this, and not necessarily the most likely vector for some threat models (it applies when people can insert HTML onto your site).

There are session hijacking attacks which are regarded as remotely exploitable by NIST, and exploits are in the wild today. Here are some refs:



  • how SSL works
  • how PKI works
  • how cookies are used to manage sessions
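On the last point, a sketch of how a signed session cookie might look, assuming Python and a hypothetical server-side `SECRET`: the server signs the session ID, sets the protective flags, and verifies the signature when the browser echoes the cookie back on each request.

```python
import hashlib
import hmac

SECRET = b"demo-secret"  # hypothetical server-side key

def session_cookie(session_id: str) -> str:
    """Build a Set-Cookie value: a signed session ID plus the flags that protect it.
    HttpOnly hides the cookie from JavaScript, Secure restricts it to HTTPS,
    and SameSite limits when it is sent cross-site."""
    sig = hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()
    return f"session={session_id}.{sig}; HttpOnly; Secure; SameSite=Lax"

def verify_cookie_value(value: str) -> bool:
    """Check the signature on the 'id.sig' value the browser sends back."""
    session_id, _, sig = value.partition(".")
    expected = hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Real frameworks handle this for you; the point is to understand what the flags and the signature are each defending against.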

Make sure someone in the organization already has the content maintenance, ongoing SEO and marketing plan worked out fully. Because if they haven't, they're going to default to you to provide all of those things (possibly with little compensation).


Even though there are many things to consider here, you should not neglect the performance of the website, especially the loading time. This can be achieved through caching, gzip/deflate compression, etc. Check a list of things you must do for a faster website in "7 Steps to Speed Up Website Loading – Website Optimization Tips, Part 1".


What about languages?

  • HTML
  • JavaScript
  • PHP/.NET/Python
  • Ajax