If you're a developer, like me, you might think that SEO and analytics don't concern you, and that they are things only the marketing team, the sales team and maybe the directors need to care about. If you develop purely back-end systems, intranets or basically anything that is not openly accessible on the web, then you might be right. But if the projects you work on are ever likely to be seen and used by real end users, then I'm sorry to say you're wrong. Developers should learn the basics of analytics and SEO, and here is why.
I'm not saying that you need to become the office's new SEO guru, or that you should be the go-to person for all things analytics, but you should at the very least gain a basic understanding of the two so that you know:
- What they are used for
- Why they are important
- How they work
This will allow you to create sites which better serve end users, make life easier for your colleagues and ultimately improve the overall quality of the web projects you work on.
So why should you be considering SEO and analytics when you build things and what can you implement to benefit them?
Provide accurate data
Once you understand what your analytics-loving colleagues are trying to achieve, and the data they need to capture in order to achieve it, you may well find that you can add value by highlighting sources of inaccurate or incomplete data. More often than not, the people asking you to implement tracking code and playing around with the results do not know how to build a website, or exactly how websites work. This can lead them to make inaccurate assumptions about the data they will receive in their reporting tools. For example, they may assume that everything they view as a 'page load' will be reported as such when in fact, unless given special consideration by the dev team, any content loaded in dynamically after the initial page load (i.e. via AJAX) will not trigger 'page views' in most reporting tools. Similarly, those who are not familiar with dynamically generated pages may be confused as to why Google has managed to get into an infinite loop whilst crawling the links on your site, something that can be prevented by adding a simple rel="nofollow" attribute to certain links.
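To make the AJAX point concrete, here is a minimal sketch of reporting a 'virtual' pageview, assuming Google's analytics.js library is already loaded on the page (the function name and path are made up for illustration):

```html
<script>
  // Call this after your AJAX success callback has swapped in new
  // content, so the navigation still shows up as a pageview in reports.
  function trackVirtualPageview(path) {
    ga('set', 'page', path);   // override the tracked page path
    ga('send', 'pageview');    // record a pageview for that path
  }

  // e.g. trackVirtualPageview('/products/page-2');
</script>
```

Without something like this, every in-page navigation is invisible to the reporting tools your colleagues rely on.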
Validate your assumptions
As a developer I get most of my job satisfaction from proving someone who disagrees with my development and design choices wrong and rubbing it in, normally with some level of 'I told you so' (don't pretend you don't enjoy this as well). Sometimes, especially in UI design and UX, what is good and what is bad can be a matter of perspective or preference, and it can be difficult to prove your point that "That button is poorly placed and no one will use it", or to win over a client who claims "We don't need to bother spending extra time and money making the site responsive because no one uses the internet on their phones or tablets".
However, analytics and event tracking can help you back up your beliefs and design choices with some important statistics. For example, Google Analytics will allow you to segment traffic to your website based on device type and, more recently, will even break it down for you based on viewport dimensions, making it much easier to show the client that, actually, 65% of their website's traffic comes from non-desktop devices. You can also track user interactions with specific elements to show how much use they actually get, and when combined with A/B testing this lets you remove almost all the ammunition from the "I'm right, you're wrong" debate.
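As a rough sketch of the kind of interaction tracking described above, again assuming Google's analytics.js is loaded (the element id and the category/action/label values are made up for illustration):

```html
<button id="signup-button">Sign up</button>

<script>
  // Record a custom event each time the button is clicked, so its
  // real-world usage shows up in the analytics reports.
  document.getElementById('signup-button').addEventListener('click', function () {
    ga('send', 'event', 'CTA', 'click', 'homepage-signup');
  });
</script>
```

Events like this are what let you say "that button was clicked 7 times last month" instead of arguing from gut feel.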
That said, you might actually find the opposite to be true, and that the 20+ hour feature you developed, which you swore would be appreciated and used by everyone, has actually been used once or twice since you put it live last month. You never know, developers do get it wrong sometimes too (so I'm told at least :-P). At this point you must backpedal at an astounding pace, retract your comments (or better yet claim you never made them), make light of it and keep your head down (probably muttering some excuse to yourself just out of earshot of everyone else). But most importantly, learn from it, and consider it next time you suggest adding that 'super cool' new feature you have seen on someone else's portfolio.
Provide semantic meaning and structure
If you are not sure what semantics means, let me explain it with an example (which, whether you understand semantics or not, you have probably heard before): "fruit flies like a banana". Does that mean that 'fruit flies' like to eat/live on bananas, or does it mean that if you throw any type of fruit, it will glide through the air in the same way a banana would? It's a bit of a silly example, but you can see how you need some level of reasoning to determine what the sentence means in a given context. As a human with a brain you actually make these kinds of decisions all the time without even thinking about it, for example when you process homographs. When was the last time you had to stop reading a sentence and consider whether it was referring to 'wind' as in 'blows through the trees' or 'wind' as in 'wind your watch'? Exactly!
Computers and software, however, do not (at least not yet!) have a brain capable of reasoning in the same way you do, so how are they to know what you meant when you searched for 'wind'? A relatively new method of providing contextual meaning to the contents of our websites and their data is 'structured data', which can in turn be used to enable 'linked data'. Structured data is data that has been formatted and labelled in such a way that it fits a pre-defined context (aka a vocabulary). This vocabulary can be defined anywhere, but the most well known and openly available set of vocabularies is maintained by schema.org. These vocabularies define relatively broad objects (e.g. a place) and their associated properties (e.g. address) and data types. Marking up your site can be achieved in a number of ways, such as RDFa and JSON-LD, and is a step beyond older methods of providing semantic meaning, such as the <address> tag in HTML. It means that search engines can gain a deeper understanding of what the data and text on your site represent, which in turn means that if someone searches for "[your business] headquarters address", search engines are far more likely to return the relevant content from your website.
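As an illustration of the JSON-LD approach, here is a minimal sketch using the schema.org Organization and PostalAddress vocabularies (the business name, URL and address are all made up for this example):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Web Co",
  "url": "https://www.example.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "St Helier",
    "addressRegion": "Jersey",
    "addressCountry": "JE"
  }
}
</script>
```

A block like this sits in the page head and is invisible to users, but tells search engines unambiguously that this text is an organisation's address rather than just a run of words.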
Structuring things in this way also helps create a web of 'linked data', where data is connected to other data which references the same 'real world' object (or fictional objects, for that matter). For example, linking the text 'Joe Harvey' to my Wikipedia page (if I had one). This would help search engines return content about me when someone searches for "Joe Harvey Jersey", instead of returning data relating to the late Newcastle Utd player & manager Joe Harvey, for example.
Understand your user base
As well as allowing you to segment traffic by device type, many analytics tools will also provide you with many other demographics and allow you to segment on those: for example country, language, browser, operating system and ISP. Some of Google's advertising features will even go a step further and allow you to capture things like age and gender (when that data is available, i.e. the user is logged in to their Google account). These demographics can help you understand your user base better: where they are located, what their preferred interface language is, what kind of browsers and devices they use to view your site, and so on. This data can help you make informed decisions about which features and functionality to implement, and how much effort to put into being fully cross-browser compatible. After all, why would you invest hours of development time into supporting IE 6 if you have only had a single visit from any IE-based browser in the last year?
Get your site seen and used (and hide the parts you don't want the world to see)
Developers will probably be more interested in the second part of the above heading than the first. Arguably, even if you do nothing to 'help' search engines learn about the pages and content of your site, they will do a half decent job of it under their own steam anyway (albeit slower than they could). This has its pluses (there is less pressure on you to do anything), but it also comes with its own issues, namely that anything which is accessible via a URL can, and probably will, eventually be crawled by search engines and indexed for search. This includes all those lovely scripts and processes which you only intended to be used by your codebase, not by end users (i.e. clientsite.com/some/dev-based/process/). It also means that alternative domains and/or URLs which can be used to access your site and its content (i.e. client.mywebdevcompany.com) will also get indexed.
The impact of this can be anything from 'well, that's embarrassing' to 'oh no, why has that resource-intensive script intended for my use only been run 100 times in the last hour?!'. This can be prevented through a number of methods: my favourite is normally to add an 'X-Robots-Tag: noindex' header to any resources I do not want indexed for search, but you can also use a robots meta tag (to prevent indexing) or the robots.txt file (to prevent crawling in the first place). Google also offer a 'request removal' tool for when you forget to mark a resource as noindex and need to retrospectively remove it from search results, which is good to know about.
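As a sketch of the header approach, assuming an Apache server with mod_headers enabled (the path pattern is made up to match the earlier example; nginx and other servers have equivalent directives):

```apache
# Send the X-Robots-Tag header on anything under the dev-only path,
# telling search engines not to index those responses.
<LocationMatch "^/some/dev-based/">
    Header set X-Robots-Tag "noindex"
</LocationMatch>
```

The advantage of the header over a meta tag is that it works for non-HTML resources too, such as PDFs and scripts, where there is no markup to put a meta tag in.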
Sitemaps and tools like Google Search Console are a good way of helping search engines better understand what the user-facing parts of your site are, where they sit, how often they are updated and so on. Most sites will receive the majority of their traffic from paid or organic search rather than direct traffic or referrals, so getting the appropriate parts of your site indexed for search is key to ensuring your site actually gets used.
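For reference, a minimal XML sitemap following the sitemaps.org protocol looks something like this (the URLs and date are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/contact/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

You then point search engines at it, either via Search Console or with a `Sitemap:` line in your robots.txt file, so they know which pages you actually want crawled and how fresh they are.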
Improve the overall quality and usefulness of the sites you produce
At the end of the day, despite the fact that we as developers take more pleasure in writing clean, clear and minimalistic code, there is little point in doing so if no one knows about or uses your site. It would be like building the perfect bike but never allowing it to be ridden: self-gratifying maybe, but otherwise pointless. Your users are far more likely to notice and appreciate a site which is easy to find and easy to use than one which is listed on the 10th page of search results for any given keywords and has poor UX, even if that site has all the functionality they require and is well optimised in terms of performance.
In short, there are many improvements you can make to your work by improving your site's UX and performance based on existing usage data, and by improving its visibility. One key measure of a site's success is how much use it gets, and analytics and SEO are key contributors to increased levels of traffic and, perhaps more importantly, user satisfaction.
If you would like us to audit your website to improve your SEO, please get in touch.