Saturday, September 16, 2017

A holistic approach towards streamlining the Code Review process, with a real-world use case

Recently I joined a new team and took on a new responsibility: doing code reviews. It's not that I haven't done code reviews in the past, but this time the volume is large, the timeline is tight, and it is a regular responsibility on my shoulders alongside my other work.

Several questions popped into my head (listed below), not only because I can't imagine going through every line of code to find flaws, but also because I wondered how others would react if I raised concerns about their code, being new to the team.
  • What parameters should one practically cover while doing code reviews?
  • Is there any scope for automation in the code review process?
  • What are the current process and tool sets used for code review?
  • How should one share code review recommendations / suggestions so that no one gets offended in any way?
  • What best practices should one follow while doing code reviews?

Although I had some idea of the answers to the above questions, it didn't stop me from digging around the internet. As a result, I gained good insight into the process and approach that can be used for code reviews. So I am writing this piece today to share a holistic approach towards streamlining the code review process that one could follow.

Let's start our journey!
  1. Identifying code review parameters
  2. Identifying the scope for automation in the code review process
  3. Gap Match Analysis & taking action accordingly
  4. The right approach to sharing code review recommendations / suggestions
  5. Important pointers to keep in mind while doing code reviews
We will discuss each of these in detail below.


1. Identifying code review parameters

This section gives an idea of the different parameters, with some examples, that one should consider while doing a code review.
  • Coding Conventions, Standards, Formatting & Best Practices
    • Source files are encoded in UTF-8.
    • Braces are used with if, else, for, do and while statements, even when the body is empty or contains only a single statement.
    • Class and member modifiers, when present, appear in the order recommended by the Java Language Specification: public, protected, private, abstract, default, static, final, transient, volatile, synchronized, native, strictfp.
    • Parameter and local variable names are written in lowerCamelCase.
  • Bad Practices
    • Dead code - unused local variables, parameters and private methods
    • Possible bugs - empty try/catch/finally/switch statements
    • Over-complicated expressions - unnecessary if statements, for loops that could be while loops
    • Unnecessary return, final modifier, parentheses
    • Redundant or duplicate code
  • Code Coverage
    Code coverage tells us how much of one's code is covered (i.e. executed) when the automated test suites / test cases run. The higher the coverage percentage, the better. Various tools can be used to measure code coverage, such as Cobertura, EMMA and JaCoCo.
  • Architecture  
    • Use of appropriate design patterns where applicable, such as Singleton, Factory, Circuit Breaker, etc.
    • Separation of Concerns (SoC) - layered and tiered architecture, e.g. presentation, business and data layers
  • Object-Oriented Analysis and Design (OOAD) Principles
    Check whether the S.O.L.I.D. principles are followed. These principles were formulated by Robert C. Martin, popularly known as Uncle Bob.
    • Single Responsibility Principle (SRP): Do not place more than one responsibility in a single class or function; refactor into separate classes and functions.
    • Open/Closed Principle (OCP): Objects or entities should be open for extension but closed for modification, meaning that while adding new functionality, existing code should not be modified.
    • Liskov Substitution Principle (LSP): Every subclass should be usable as a substitute for its parent class.
    • Interface Segregation Principle (ISP): A client should never be forced to implement an interface it doesn't use, and clients shouldn't be forced to depend on methods they do not use. In other words, do not create lengthy interfaces; split them into smaller interfaces based on functionality.
    • Dependency Inversion Principle (DIP): Instead of hard-coding dependencies, inject them; for example, avoid using the new operator to create a class's collaborators inside the class itself.
  • Functional Requirements
    By the time code reaches the review phase, it is expected to meet the functional requirements specified in the user stories or requirement document. But there is no harm in checking once again.
  • Non-Functional Requirements
    • Re-usability
      • Stick to the DRY (Don't Repeat Yourself) principle
      • Look for reusable classes, functions, services and generic components
    • Security Vulnerabilities
      • SQL Injection
      • Cross Site Scripting 
      • Broken Authentication and Session Management
      • Cross Site Request Forgery
    • Performance
      • Use of appropriate data types and collection classes, for example knowing when to use StringBuffer vs StringBuilder, ArrayList vs LinkedList, etc.
      • Parallel processing vs serial processing
      • Caching solution 
      • Lazy Loading 
      • Check whether performance tuning techniques have been applied
      • Response time of a web page
    • Maintainability 
      • Readability - Self explanatory code
      • Easy to test and debug 
      • Configurable - configuration details are passed in dynamically from outside, either as command line arguments or via an application properties file.
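The Dependency Inversion point above is easiest to see in code. Here is a minimal Java sketch (the Notifier / OrderService names are purely illustrative) where the collaborator is injected through the constructor instead of being created with the new operator inside the class:

```java
// Dependency Inversion sketch: OrderService depends on the Notifier
// abstraction; the concrete implementation is injected from outside.
interface Notifier {
    String send(String message);
}

class EmailNotifier implements Notifier {
    public String send(String message) {
        return "email: " + message;
    }
}

class OrderService {
    private final Notifier notifier;

    // The dependency is injected, not created with `new` inside the class.
    OrderService(Notifier notifier) {
        this.notifier = notifier;
    }

    String placeOrder(String item) {
        return notifier.send("order placed for " + item);
    }
}

public class DipDemo {
    public static void main(String[] args) {
        OrderService service = new OrderService(new EmailNotifier());
        System.out.println(service.placeOrder("book"));
    }
}
```

Because OrderService only knows the interface, a reviewer can point out that a test double or an SMS-based Notifier can be swapped in without touching OrderService at all.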
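For the performance parameter on appropriate data types, a classic review finding is string concatenation inside a loop. A hedged sketch (the helper name is made up for illustration) of the StringBuilder alternative:

```java
public class ConcatDemo {
    // Concatenating with + in a loop copies the whole string on every
    // iteration (roughly O(n^2) work); StringBuilder appends in place.
    static String joinWithBuilder(String[] parts) {
        StringBuilder sb = new StringBuilder();
        for (String p : parts) {
            sb.append(p).append(',');
        }
        if (sb.length() > 0) {
            sb.setLength(sb.length() - 1); // drop the trailing comma
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(joinWithBuilder(new String[] {"a", "b", "c"}));
    }
}
```

StringBuffer offers the same API with synchronized methods, so it is only worth its extra cost when the builder is shared across threads - exactly the kind of trade-off a reviewer should ask about.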

Conclusion 

The above code review parameters are not exhaustive; they merely give the code reviewer a direction for conducting effective code reviews and delivering good-quality code. If you can think of any other parameter that should be part of this list, please share it in the comments and I will include it in the article.


2. Identifying the scope for automation in the code review process

In this Agile world, one should focus on smart work rather than hard work - that is, use the right thing (people/tools/process) for the right task at the right time. Some questions to help one decide on the right approach:
  • Why would one waste precious time when we have great static and dynamic code analyzer tools that automatically generate reports according to our customized rule sets?

    For reference, here are some tools and their primary focus areas:
    • PMD: Bad practices
    • Checkstyle: Coding conventions
    • FindBugs: Potential bugs
    • HP Fortify: Security vulnerabilities
    • JaCoCo, EMMA, Cobertura: Code coverage
    • SonarQube: Continuous inspection; can integrate various static & dynamic code analyzer tools
  • Is there any tool available in the market that can crawl through our code and tell whether the right architecture (design patterns, SoC) and OOAD principles (S.O.L.I.D.) have been followed in the application?
    The answer is a straight no (and here lies the need for human expertise).
  • If one integrates the right tools with the CI server, then defects will surface at the right time, i.e. development time.
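Tools like PMD are driven by rule sets, which is what makes the "customized rule sets" idea above practical. Below is a minimal, hypothetical PMD 6.x ruleset that enables a few of the bad-practice checks from section 1 (the rule references come from PMD's built-in Java categories; the file name and description are assumptions):

```xml
<?xml version="1.0"?>
<ruleset name="custom-rules"
         xmlns="http://pmd.sourceforge.net/ruleset/2.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://pmd.sourceforge.net/ruleset/2.0.0
                             https://pmd.sourceforge.io/ruleset_2_0_0.xsd">
    <description>Sample custom rule set covering dead code and empty blocks</description>

    <!-- Dead code: unused local variables and private methods -->
    <rule ref="category/java/bestpractices.xml/UnusedLocalVariable"/>
    <rule ref="category/java/bestpractices.xml/UnusedPrivateMethod"/>

    <!-- Possible bugs: empty catch blocks -->
    <rule ref="category/java/errorprone.xml/EmptyCatchBlock"/>
</ruleset>
```

Once a ruleset like this is agreed on by the team, every violation it catches is one less comment a human reviewer has to write.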

Conclusion:

Use your judgment and use the right thing (people/tools/process) for the right task at the right time. Don't waste time on things that could be automated or for which an out-of-the-box solution exists in the market. Wherever there is scope for automation, don't hesitate to automate.

3. Gap Match Analysis & taking action accordingly

Unless one understands the tools and processes currently in place for code review, one can't suggest or take any corrective measures to improve the ongoing process. Now we have equipped ourselves with the right knowledge (i.e. the different code review parameters and how to judiciously use people, process and tools) that will help us do a gap analysis and take the required actions accordingly. A few points one should keep in mind:
  • Risk assessment - assess what's going wrong (the gap) and its associated risk
  • Risk prioritization - categorize risks and resolve them accordingly
  • Prioritize action items - don't try to incorporate everything in one go, otherwise it may hamper the release.
Let's take a real-world, industry-standard use case to understand this:
  • Gap Match Analysis 
    • Existing Process and Tools 
      • SonarQube-Jenkins integration is in place with the default quality profile, i.e. Sonar Way
      • The Sonar Way profile is not configured properly; it has only bare-minimum security-related rule sets
      • Only Blocker and Critical issues reported by SonarQube are currently resolved
      • Major and Minor issues pass through the quality gate without being fixed
      • No manual code review process is in place
      • A code coverage tool is in place, but no one has analyzed whether it is the right one
    • Desired State
      • Should have a customized quality profile combining the best of Sonar Way, PMD, Checkstyle, etc.
      • Based on enterprise application, choose the right code coverage tool 
      • Code should not be promoted to QA, Staging until it passes all the quality checks
      • Level 1 & Level 2 code reviews should be in place
  • Action Items
    • Analyze the tools one by one, e.g. PMD, and create custom rule sets
    • Integrate PMD into SonarQube to create a quality profile
    • Configure the Sonar Way rule sets
    • Fix all Major issues first
    • Then the Minor ones
    • Then take up the architecture & OOAD related recommendations as maintenance tasks
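The desired state of not promoting code to QA or Staging until it passes all quality checks can be enforced in the pipeline itself. Below is a minimal, hypothetical Jenkins declarative pipeline sketch using the SonarQube Scanner plugin's withSonarQubeEnv and waitForQualityGate steps (the server name, Maven goals and stage names are assumptions, and waitForQualityGate requires a SonarQube webhook pointing back at Jenkins):

```groovy
// Hypothetical Jenkinsfile: the build stops if the Sonar quality gate fails,
// so code cannot be promoted past this point with unresolved issues.
pipeline {
    agent any
    stages {
        stage('Build & Test') {
            steps {
                sh 'mvn clean verify' // runs unit tests and coverage (e.g. JaCoCo)
            }
        }
        stage('Sonar Analysis') {
            steps {
                withSonarQubeEnv('MySonarServer') { // server configured in Jenkins
                    sh 'mvn sonar:sonar'
                }
            }
        }
        stage('Quality Gate') {
            steps {
                // Abort the pipeline if the quality gate is not green
                waitForQualityGate abortPipeline: true
            }
        }
    }
}
```

With this in place, "Major and Minor issues passing the quality gate unfixed" becomes impossible by construction rather than a matter of discipline.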

Conclusion: 

Understand the current system, find the gaps, and prioritize the action items accordingly; otherwise it may hamper one's release.

4. The right approach to sharing code review recommendations / suggestions

I have kept this point intentionally because it is essential to create a positive, constructive and collaborative work environment around us. We don't want to offend anyone in any way, even unintentionally, while sharing our code review recommendations and suggestions. Here are some points:
  • Don't share a code review recommendation / suggestion without stating the issue and its consequences
  • Don't force anyone to rewrite or completely change their code, as it might hurt their ego. Instead, point out the missing design pattern or OOAD principle and ask for their opinion on the current implementation versus the desired approach.
  • Educate developers by sharing various online resources so that they don't repeat the same mistakes.
  • Help developers see this exercise as an opportunity to learn and discuss best practices.

5. Important pointers to keep in mind while doing code reviews

That's it! We are all set to do code reviews. I am open to any kind of suggestions; please feel free to share them in the comments. Happy learning!
