Victor is a full stack software engineer who loves travelling and building things. He most recently created Ewolo, a cross-platform workout logger.
What to expect in a code quality assessment

In the past year or so, I've had to do a couple of code quality assessments, and I decided to make my notes public so that I can provide a quick reference to prospective clients.

Before we jump into all the points that a code quality assessment should cover, it is important to identify the primary goals of the assessment. These are client-specific and could also include any specific questions that the client needs clarity on. Thus, the following may change depending upon the level of detail required, but in general, here are the various aspects that a code assessment covers:

Code Quality
Readability
This is one of the first items on the list and it basically involves looking at how easily someone can jump into the code. Do functions and variables have intuitive names? Does the code read consistently across the code-base? And so on.
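As a quick illustration (the function and variable names here are invented for the example), the same logic reads very differently depending on naming:

```typescript
// Hard to jump into: single-letter names give no hint of intent.
function f(d: number[], t: number): number[] {
  return d.filter((x) => x > t);
}

// Easier to jump into: names describe the data and the rule being applied.
function filterSessionsLongerThan(durationsInMinutes: number[], thresholdMinutes: number): number[] {
  return durationsInMinutes.filter((duration) => duration > thresholdMinutes);
}
```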
Adaptability
This sub-aspect covers how well modularized the code is and how easy it is to add/remove features. This also includes taking a look at the effort required to upgrade and/or switch technology.
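A minimal sketch of what good modularization tends to look like (the `WorkoutStore` interface and its implementation are hypothetical): features depend on an interface rather than a concrete technology, so switching technology means writing a new implementation instead of rewriting every feature.

```typescript
interface Workout {
  id: string;
  date: string;
  exercises: string[];
}

// Features depend only on this interface, not on a specific database.
interface WorkoutStore {
  save(workout: Workout): Promise<void>;
  findByDate(date: string): Promise<Workout[]>;
}

// Swapping the storage technology later only requires a new class
// that implements WorkoutStore; the features themselves stay untouched.
class InMemoryWorkoutStore implements WorkoutStore {
  private workouts: Workout[] = [];

  async save(workout: Workout): Promise<void> {
    this.workouts.push(workout);
  }

  async findByDate(date: string): Promise<Workout[]> {
    return this.workouts.filter((w) => w.date === date);
  }
}
```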
Interoperability
Most applications deal with data, and this sub-aspect covers how well the data is structured and how easily it can be migrated. This also covers any other standardization that the application should follow, e.g. if it is a medical application, does it use standardized codes for medication?
Execution
Maintainability
This sub-aspect covers one of the most important features of a code-base: testing. In my opinion, testing is an integral part of software maintainability. If I were to write some code that would only ever run once, I wouldn't need to write any tests. In any case, here we look at what sorts of tests the code-base features: unit, integration and end-to-end. End-to-end tests rate very highly as they provide thorough coverage of the application and its maintainability. Apart from testing, we also look at how easy it is to maintain the application in production, e.g. is there proper logging? What sort of knowledge is required to support the application during an outage?
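As a minimal sketch, assuming a Jest-style test runner and a hypothetical `calculateTotalVolume` helper, a unit test both checks the behaviour and documents it for the next developer:

```typescript
import { calculateTotalVolume } from "./workout"; // hypothetical module

describe("calculateTotalVolume", () => {
  it("sums weight times repetitions across all sets", () => {
    const sets = [
      { weightKg: 100, reps: 5 },
      { weightKg: 80, reps: 10 },
    ];
    // 100 * 5 + 80 * 10 = 1300
    expect(calculateTotalVolume(sets)).toBe(1300);
  });
});
```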
Accuracy
What sort of bugs does the application come with? Does it do what it is supposed to do? For data-based applications, is the data stored correctly? How does the application respond under load?
Usability
For user-facing applications, this is an important aspect to consider. Is the application easy to use for its primary users? Does it follow accessibility standards and/or other standardized UX patterns, e.g. do forms provide data validation hints?
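As an example of a validation hint (the function name and messages are invented for illustration), a form field can check its value and surface a human-readable message next to the input instead of failing silently on submit:

```typescript
// Returns a hint to display next to the input, or null if the value looks fine.
function validateEmail(value: string): string | null {
  if (value.trim() === "") {
    return "Please enter an email address.";
  }
  // Deliberately simple pattern; production validation is usually stricter.
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(value)) {
    return "That doesn't look like a valid email address.";
  }
  return null;
}
```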
Security

This aspect covers all security-related issues such as:

  • password management: are user passwords hashed? (see the sketch after this list)
  • logging: what sort of information is logged?
  • operations: who has access to production and what can they do?
  • ssl: are web applications served over SSL?
  • hardware: are the machines running the application patched and securely set up? e.g. do web applications expose any unprotected ports?
  • data sanitization: is input data sanitized? (also covered in the sketch below)
  • dependencies: are dependencies regularly updated/audited?
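To make the password management and data sanitization points concrete, here is a minimal sketch assuming a Node/TypeScript stack with the bcrypt and pg packages; the table and column names are invented:

```typescript
import * as bcrypt from "bcrypt";
import { Pool } from "pg";

const pool = new Pool(); // connection settings taken from the environment

// Password management: store a salted hash, never the plain-text password.
async function createUser(email: string, plainTextPassword: string): Promise<void> {
  const passwordHash = await bcrypt.hash(plainTextPassword, 12); // 12 salt rounds
  // Data sanitization: a parameterized query keeps user input out of the SQL itself.
  await pool.query("INSERT INTO users (email, password_hash) VALUES ($1, $2)", [
    email,
    passwordHash,
  ]);
}

// The login check compares against the stored hash rather than a stored password.
async function checkLogin(email: string, plainTextPassword: string): Promise<boolean> {
  const result = await pool.query("SELECT password_hash FROM users WHERE email = $1", [email]);
  if (result.rows.length === 0) {
    return false;
  }
  return bcrypt.compare(plainTextPassword, result.rows[0].password_hash);
}
```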
Software engineering process

The software engineering process plays a key role in the health of a project. Some of the points to be evaluated are:

  • software development: does development take place according to a standardized methodology, e.g. waterfall, agile, kanban, etc.?
  • documentation: is there sufficient documentation to bootstrap the project and get a development environment set up? Is there in-code documentation for critical/complex concepts?
  • bug tracking: is there a bug/issue tracking system in place?
  • version control: does the project use version control?
  • 4 eyes: is there a review of code before it gets merged into master?
  • continuous integration: are tests run on every push to master or on a regular basis?
  • continuous deployment: is the code deployed into production automatically and securely?

Note that the above list explores not just the state of the code but also its execution aspects, because the two are very much intertwined. For example, a web application that is simple, easy to read, developed according to best practices and does everything that the user requires, but has security flaws such as storing user passwords in plain text and/or running on insecure machines, is still classified as being in a critical state requiring immediate attention!

The following is an example grading scheme that I generally use:

  • A : professional, done according to industry best practices and/or standards
  • B : good, however some small concerns and/or scope for improvement
  • C : does the job but classified as technical debt
  • D : requires immediate attention due to critical bugs and/or security issues
  • - : not in scope of current assessment

I usually grade each aspect and provide a final summary with action points. Feel free to discuss this on HN or email me directly with feedback!