Introduction

In the information age, a strong online presence is essential for any company or individual. With the ever-growing demand for online services, digitalization, and online social interaction, a good-looking and properly working website has become an integral part of any success story. This is why assessing the quality of a website is essential.

Internet technologies are changing the way business is done. Creating an online business is now easier than ever, and everyone wants to be part of it. However, the competition is brutal, so to survive in the modern online world a website needs to be of top quality.

In this post we will first outline the key aspects of evaluating website quality. We will then discuss the different testing strategies we can use to evaluate how well a website performs on each of these quality aspects. Finally, we will conclude with real-life examples of website issues, tests, and performance reports.

Measures of website quality

Performance


Performance covers the various technical speed measures of a website.

Page load speed

Page load speed is the time needed for the page to become fully functional. For measurement purposes, this time is divided into several checkpoints:

  • First contentful paint – the time when the first text or image appears on the page
  • First meaningful paint – the time when the main styled content is rendered on the page
  • Time to interactive – the time when all resources are loaded and the user can interact with the page
  • Deferred resources load – the time when non-critical resources finish loading
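
These checkpoints can be observed directly in the browser. Below is a minimal sketch using the standard Performance APIs; note that first contentful paint is exposed directly, while time to interactive usually requires a tool such as Lighthouse to compute.

```typescript
// Observe paint checkpoints as the browser reports them.
const paintObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // entry.name is "first-paint" or "first-contentful-paint"
    console.log(`${entry.name}: ${entry.startTime.toFixed(0)} ms`);
  }
});
paintObserver.observe({ type: "paint", buffered: true });

// Navigation timing gives load milestones such as DOM interactive and full load.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
if (nav) {
  console.log(`DOM interactive: ${nav.domInteractive.toFixed(0)} ms`);
  console.log(`Full load: ${nav.loadEventEnd.toFixed(0)} ms`);
}
```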

There are three key factors that impact page load speed:

  • Network – the time needed for the server to transfer the website data to the user
  • Protocol – typically HTTP, which defines how data is encoded and delivered to the browser
  • Source code – the framework’s specifics of how the page is loaded and rendered by the browser

This information comes from Google’s Lighthouse performance testing framework, whose documentation also includes other tips for improving website quality, so it is a good read.
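
As a rough illustration, Lighthouse can also be run programmatically from Node. The sketch below assumes the lighthouse and chrome-launcher npm packages are installed; exact options may differ between versions, so treat this as an outline rather than a recipe.

```typescript
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

async function auditPerformance(url: string): Promise<void> {
  const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
  const result = await lighthouse(url, {
    port: chrome.port, // talk to the launched Chrome instance
    output: "json",
    onlyCategories: ["performance"],
  });
  // Category scores are reported on a 0-1 scale.
  console.log("Performance score:", result?.lhr.categories.performance.score);
  await chrome.kill();
}

auditPerformance("https://example.com");
```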

Service requests

Typically there is a web server sitting behind every website, and requests to this server greatly influence its performance. It is impossible to define an architecture that works for all cases, but we can outline a few practices that generally work well.

Granular data transfers

Requests should be small and fast. Data should be chunked, and only the relevant parts should be loaded. When inserting multiple items of data, consider bulk requests that store them all in a single round trip, as shown below.
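
A hedged sketch contrasting per-item requests with a single bulk request; the /items and /items/bulk endpoints are hypothetical placeholders for your own API.

```typescript
interface Item { name: string; price: number; }

// Inefficient: one network round trip per item.
async function saveItemsOneByOne(items: Item[]): Promise<void> {
  for (const item of items) {
    await fetch("/items", { method: "POST", body: JSON.stringify(item) });
  }
}

// Better: one round trip for the whole batch.
async function saveItemsInBulk(items: Item[]): Promise<void> {
  await fetch("/items/bulk", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(items),
  });
}
```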

Repetitive requests

Data can be requested once and then stored in the browser for further use. This improves website performance and also reduces the load on the application server serving the data.
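
A minimal sketch of this idea: cache the in-flight request by URL so repeated calls reuse the first response instead of hitting the network again.

```typescript
const responseCache = new Map<string, Promise<unknown>>();

function fetchOnce<T>(url: string): Promise<T> {
  if (!responseCache.has(url)) {
    // Store the in-flight promise so concurrent callers share one request.
    responseCache.set(url, fetch(url).then((res) => res.json()));
  }
  return responseCache.get(url) as Promise<T>;
}
```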

Caching

Heavy operations and resources should be cached and fetched on demand. Caches can be client side or server side. With a client-side cache, data is stored in the user’s browser; typically this is used for images, fonts, or videos. A server-side cache lives on the server, so all clients benefit from it; it is commonly used to store data from third-party services or the results of resource-heavy operations.
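
On the server side, a simple time-to-live cache might look like the sketch below; the fetchRatesFromProvider call in the usage comment is a hypothetical heavy operation.

```typescript
interface CacheEntry<T> { value: T; expiresAt: number; }

const serverCache = new Map<string, CacheEntry<unknown>>();

async function cached<T>(key: string, ttlMs: number, compute: () => Promise<T>): Promise<T> {
  const hit = serverCache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value as T; // fresh entry: skip the heavy operation
  }
  const value = await compute();
  serverCache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}

// Usage: cache a third-party call for five minutes.
// const rates = await cached("fx-rates", 5 * 60_000, fetchRatesFromProvider);
```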

Content delivery network (CDN)

Usually the main performance issues of a website come from its static resources. They are the biggest chunk of data that needs to be transferred to the client before the web page becomes useful.

CDNs are networks of proxy servers spread across the globe that keep copies of the website’s static resources. Clients connect to the closest CDN server and download the resources with minimal network delay. All international websites should leverage this technology.

Mobile friendliness


Mobile phones are comparatively ‘weak’ devices with limited resources and network connectivity, so a web page may perform really well on desktop machines and very poorly on phones. However, around 50% of web traffic is mobile, and businesses simply can’t afford to lose half of their customers. This is why all modern websites should be prepared for mobile users.

Responsiveness

Elements on the web page should dynamically adjust to the screen size. The layout should change if needed, and text, image, and video sizes should be adjusted for the aspect ratio and the network connectivity.
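
Breakpoints normally live in CSS media queries, but the same conditions can be read from script. A small sketch, assuming a 600px breakpoint and a hypothetical mobile-layout CSS class:

```typescript
const smallScreen = window.matchMedia("(max-width: 600px)");

function applyLayout(matches: boolean): void {
  // Toggle a class that the stylesheet uses to switch layouts.
  document.body.classList.toggle("mobile-layout", matches);
}

applyLayout(smallScreen.matches);
// React when the viewport crosses the breakpoint (e.g. device rotation).
smallScreen.addEventListener("change", (e) => applyLayout(e.matches));
```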

Progressive web app

The website should remain operational even if it loses its internet connection. All functionality that doesn’t explicitly need a connection should stay active. Furthermore, where possible, user updates should be stored locally and applied once the device is back online.
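
Offline support is typically built on a service worker. A minimal sketch, with typings simplified via `any` and assumed file names:

```typescript
const CACHE_NAME = "offline-v1";

// Pre-cache the core assets when the service worker is installed.
self.addEventListener("install", (event: any) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) =>
      cache.addAll(["/", "/app.js", "/styles.css"])
    )
  );
});

// Answer from the cache first; fall back to the network when online.
self.addEventListener("fetch", (event: any) => {
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```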

Native mobile application

Native mobile applications are optimized and designed to be very efficient and fast on mobile devices. Most modern websites have mobile versions, and upon opening them on a phone you are prompted to install the native app. This saves both mobile data and battery.

Accessibility


Accessibility refers to the site’s ease of use. This covers regular users as well as users with conditions such as light sensitivity, blindness, or deafness. Some users might need extra help to use the website: for example, an audio version of the text for blind users, or a dark theme for light-sensitive users.

Proper html meta attributes

Modern browsers already support most accessibility use cases, but the proper meta tags and attributes need to be present in the source code to enable them (a small sketch follows the list).

  • Navigation attributes: allow easy navigation through the web page (which can be vital for users with motor impairments)
  • Alt attributes: alt descriptions for images and videos (for people using the audio version of the page)
  • Keyboard control support
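
As a small illustration, the same attributes can be set from script, though in real projects they usually live directly in the HTML markup. The selectors and values below are assumptions.

```typescript
const img = document.createElement("img");
img.src = "/chart.png";
img.alt = "Bar chart of monthly sales"; // read aloud by screen readers

document.querySelector("nav")?.setAttribute("aria-label", "Main navigation");

// Keyboard support: make a custom control reachable and activatable.
const control = document.querySelector<HTMLElement>(".custom-button");
if (control) {
  control.tabIndex = 0; // include the element in the Tab order
  control.addEventListener("keydown", (e) => {
    if (e.key === "Enter" || e.key === " ") control.click();
  });
}
```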

Contrast ratio

There is a formula that calculates the contrast ratio between text and background colors, and accessibility guidelines define minimum ratios. This is one of the most important factors in the readability of a page, alongside its actual contents.
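
The formula referred to here is the WCAG 2 contrast ratio, built on relative luminance. A sketch in TypeScript:

```typescript
// Linearize an 8-bit sRGB channel, per the WCAG 2 definition.
function channelToLinear(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance(r: number, g: number, b: number): number {
  return 0.2126 * channelToLinear(r) + 0.7152 * channelToLinear(g) + 0.0722 * channelToLinear(b);
}

function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [hi, lo] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05); // WCAG AA asks for at least 4.5:1 for body text
}

// Black text on a white background yields the maximum ratio of 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // 21
```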

Best practices


Over the years the web developer community has come up with an extensive list of good practices and common patterns to follow when building websites. Following them ensures the code runs the same way in all browsers and can reliably leverage the features the target browser supports.

Cross browser compatibility

New browser updates ship every week, so web and browser developers need to be aligned on the direction of the technologies in use. Both parties must follow previously agreed principles to keep old websites working and new ones modern.

Cost of maintenance

By following the commonly accepted practices for building websites, it is easier for developers and quality assurance engineers to switch projects or be onboarded onto new ones. Furthermore, with the rise of open source, a project built on standard practices opens up many opportunities. This in turn can save a big chunk of the project budget.

Stability

In general, common practices are inspired by accidents and outages of the past, and the same goes for website development best practices. By following them, teams can ensure a base level of stability and ease of debugging and testing.

Using the right framework

The web world is extremely diverse, with dozens of frameworks backed by huge communities and resources. Many teams fall into the trap of choosing not the right framework but their favorite one. Before committing to a particular technology, it should be evaluated and aligned across all levels of the project: product, management, development, and quality assurance.

Security


Hackers have been around for a long time. However, with the increasing need to store more and more data digitally in the cloud, attacks have become more frequent and more devastating. They are often hard to detect, and malicious users can be extracting the website’s data without the team even realizing it.

Access control

Access to the data on the website should be given only to a handful of users. Furthermore, access should be granular, meaning users should only be able to access data relevant to their work or role, as in the sketch below.
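
A minimal sketch of granular, role-based access control; the roles and actions are illustrative only.

```typescript
type Role = "viewer" | "editor" | "admin";

const permissions: Record<Role, Set<string>> = {
  viewer: new Set(["read"]),
  editor: new Set(["read", "write"]),
  admin: new Set(["read", "write", "delete"]),
};

function canAccess(role: Role, action: string): boolean {
  return permissions[role].has(action);
}

console.log(canAccess("viewer", "delete")); // false: viewers can only read
```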

Encrypted connections

All access to the website should be SSL/TLS protected. Most browsers warn users when a website is not secured or its certificate is invalid, so to make the website trustworthy an encrypted connection is an absolute must. Furthermore, encryption stops many attacks early, at the network layer.
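
One common building block is redirecting all plain HTTP traffic to HTTPS. A minimal Node sketch; certificates and the HTTPS server itself are provisioned separately.

```typescript
import http from "node:http";

// Listen on port 80 and send every request to its HTTPS equivalent.
http
  .createServer((req, res) => {
    const host = req.headers.host ?? "example.com";
    res.writeHead(301, { Location: `https://${host}${req.url ?? "/"}` });
    res.end();
  })
  .listen(80);
```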

DDOS protection

Distributed denial of service is one of the most common attacks out there, which is why it gets its own subsection. Attackers use thousands of computers to send simultaneous requests to the website (preferably heavy data extractions or insertions). This exhausts the server’s resources and ultimately brings the service down.

Various big companies get blackmailed with DDoS threats. Most commonly, ecommerce websites are targeted on days with high traffic volume, like Black Friday.
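
Full DDoS protection usually comes from dedicated services and network-level filtering, but a per-client rate limit is a common first line of defence. A minimal fixed-window sketch:

```typescript
const WINDOW_MS = 60_000;   // one-minute window
const MAX_REQUESTS = 100;   // allowed requests per client per window

const hits = new Map<string, { count: number; windowStart: number }>();

function allowRequest(clientIp: string): boolean {
  const now = Date.now();
  const entry = hits.get(clientIp);
  if (!entry || now - entry.windowStart > WINDOW_MS) {
    hits.set(clientIp, { count: 1, windowStart: now }); // start a fresh window
    return true;
  }
  entry.count += 1;
  return entry.count <= MAX_REQUESTS; // reject bursts beyond the limit
}
```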

SEO


Search engine optimization is a field of digital marketing that aims to improve a website’s search engine performance. The most popular search engine is Google’s, but there are others as well. A website should possess various features in order to perform well in search engines.

Inbound and outbound links

All globally accessible websites are part of the world wide web (WWW). The web pages form a web, with links between them, and search engines use these links to group relevant websites together.

Outbound (external) links help define the topic of the website, whereas inbound links (backlinks) define its authority and trustworthiness.

Unique Content

Search engines nowadays use advanced natural language processing techniques to identify relevant content. They easily track duplicated (stolen) pages, treat the older one as the original, and penalize the duplicate. Receiving penalties impacts the search performance of the whole website.

Relevant Content

Search engines want to show their users what they are looking for, so it makes sense for them to find the best textual match for the search phrase. Moreover, they observe user behaviour after the user lands on the page and use those actions to further evaluate the quality of the website. For example, if a user opens the page and leaves after several seconds, it could indicate one of two things: either the page was of poor quality, or it wasn’t related to the search phrase. Tricking the system is nearly impossible nowadays; the best direction a team can take is to follow the published rules and recommendations for webmasters.

Content length

The appropriate content length varies with the type of page being built, but there are guidelines based on the audience, page type, and format. Some user groups, like children, have very short attention spans, whereas middle-aged readers may be looking for a more complete and comprehensive read. Content length and structure should be based on user data and feedback.

Proper page structure

Page content is usually organized into several structures:

  • Bigger chunks of text should be placed inside paragraphs. (the ‘p’ tag)
  • Titles should be short and placed inside headings. (the ‘h1-6’ tag)
  • Links should be placed inside anchor elements. (the ‘a’ tag with an href attribute)
  • Navigation should be inside a special ‘nav’ block
  • Non visible (meta) text should be placed inside the appropriate meta tags

The automated text analysis tools used by search engines expect a similar structure. By failing to fit that frame, the team risks the website not being properly analysed by search engines.

Content

Content is tightly coupled with the other aspects of website quality. It can be text, images, or video. In any case it should be adjusted for the target audience, use case, and niche.

Grammar

Typos on a webpage will drive many people away and lower the overall performance of the website.

Readability

A readability score is a metric estimating the reading difficulty of a text, based on average sentence length, the complexity of the words used, and repetitiveness. This score is often mapped to US standard grade levels, which are based on age.
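
One widely used example is the Flesch-Kincaid grade level. A rough sketch, with a naive vowel-group syllable counter:

```typescript
// Approximate syllables as groups of consecutive vowels (a crude heuristic).
function countSyllables(word: string): number {
  const groups = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 0);
}

// Flesch-Kincaid grade level: higher values mean harder text.
function fleschKincaidGrade(text: string): number {
  const sentences = text.split(/[.!?]+/).filter((s) => s.trim().length > 0);
  const words = text.split(/\s+/).filter((w) => w.length > 0);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  return 0.39 * (words.length / sentences.length) + 11.8 * (syllables / words.length) - 15.59;
}

console.log(fleschKincaidGrade("The cat sat on the mat. It was warm."));
```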

Font

There are more than 200,000 fonts to choose from, which makes picking the right one particularly hard. There are three main things to consider: readability, consistency, and trends.

Readability of a font is subject to personal opinion, so it should be a collective decision based on real user data.

Consistency is achieved by using the same font everywhere; using several different fonts can make the website look unprofessional.

Trends among competitors or on social media can hint at what customers expect to see on the website your team is building.

Laws and regulations


General Data Protection Regulation (GDPR)

The GDPR is an EU regulation that defines rules for storing people’s personal data. All websites operating in the EU are required by law to include data privacy consents for users to review and accept. Operators are obligated to provide detailed information on how users’ personal data is stored, how it is protected, and who has access to it. Users can also request that all their records be deleted when they choose to close their account or unsubscribe from a particular service.

Payment Card Industry Data Security Standard (PCI)

This is a set of very strict regulations for websites that store credit card details; the applicable requirements vary depending on the volume of transactions processed.

Intellectual property

The organisation might hold trademarks, patents, or copyrights related to the website.

Copyright

Copyright is an automatic right assigned to the creator of any piece of original work. This could be one person, multiple people or a company. Copyright allows the owner to control how their work is used.

Trademark

A trademark is a brand symbol issued by government authorities that makes the website distinguishable. Trademarks need to be renewed periodically, depending on local regulations.

Patent

A patent is a limited-duration property right relating to an invention, granted in the United States by the United States Patent and Trademark Office in exchange for public disclosure of the invention. Patentable materials include machines, manufactured articles, industrial processes, and chemical compositions. The duration of patent protection depends on the type of patent granted.

Testing the quality of a website

These are the major website testing strategies that teams use to evaluate their pages:

  • Functional Testing
  • Visual Testing
  • Usability Testing
  • Interface Testing
  • Compatibility Testing
  • Performance Testing
  • Security Testing

We have covered each of them in great detail in our blog post.

Conclusion

Evaluating the quality of a website is a complex task with many aspects. Dealing with all sorts of requirements and technological challenges makes the process expensive and time-consuming. However, with the brutal competition on the internet, website owners can’t afford to rely on flawed websites; doing so could hurt their reputation or reduce conversions and sales.

Investing in a well-thought-out and properly implemented website testing strategy is essential for the success of any online project.