Software Consultant

Experience Report - From Zero to 500

2010-09-06T18:11:05+00:00

Every experience teaches something; it is up to us to imbibe the lessons. Sometimes worrying about success and failure takes away the joy of just doing.

I started working on the current project in September 2009, hired by the Project Manager. The technology stack is .NET 3.5, Windows Forms, SQL Server 2008, and LINQ-to-SQL. When I joined the team, the software had been in development for 4-5 months. All of us are from different regions (Middle East, Europe) and work remotely, communicating through software like Skype, TeamViewer, and of course email. We use a wiki for documentation and a web-based bug tracker.

Initially, I was given the task of understanding the software and writing the documentation. I found this a bit strange, but it turned out to be a good way to understand the software, and I was happy that a wiki was used for the documentation. My many doubts were clarified either by the PM or the client (the end user). Soon, I found there were many problems:

1) The wiki had all the information related to the system to be developed, plus tutorials for the users. But there were many ambiguities in the system documentation, and it was a bit outdated, not in sync with the state of the software. Most of the domain knowledge was with the PM, and one of the reasons he had asked me to do the documentation was to capture that knowledge.

2) The codebase had a huge amount of duplication, and there were no tests.

3) There was an architecture - three logical layers: Windows Forms, business layer, and data access layer. But a lot of logic sat in the forms.

4) All the features which had been implemented were incomplete and had bugs.

5) One of the developers involved had this strategy: he would fix the bug assigned to him and close it, without checking all the scenarios. So other bugs cropped up, and the software was unstable. I felt very uneasy about this situation.

6) We were using LINQ-to-SQL, but did not have a good understanding of the technology.
This created problems like too many queries getting fired, which we only realized later, when we started facing performance issues.

7) The database was not well designed. Some columns in some tables were unused, most columns had "allow nulls", the schema was not under version control, and we had views doing a lot of calculations.

The PM knew about these issues and was eager to overcome them; one thing I appreciate about him is his persistence and eagerness to solve problems. We started talking about testing, and he suggested writing functional tests. But we found that the DevExpress controls did not support UI testing using the Microsoft framework. So I told him we should start writing unit tests, and he was open to this idea. I also suggested that we use MVP (Model View Presenter) and take all the logic out of the forms. He agreed, and asked me to do it on a couple of forms.

I started refactoring and adding tests. Initially, I was the only one running the tests. Sometimes I found compile errors in the test project, which surprised me - until I realized the project had not been added to the solution! Initially, many of the tests were integration tests, so the database was involved. We created a script to create the database, with separate SQL data files for default data, and then wrote a script which created the database, built the projects, and ran the tests. We used NDbUnit to supply input data for the integration tests, which required creating XML files.

Slowly the test coverage increased, and it started catching some bugs, so I started writing tests for bugs too. I also realized some things were common to all forms. This was a eureka moment - seeing patterns in the mass of duplicate code, the representation of similar ideas. It led to the creation of a base view and base presenter, which eliminated a lot of duplication. Soon we had two new members on the team, eager to learn and write tests.
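The MVP split can be sketched roughly as follows - in Python for brevity, since the real views are C# WinForms classes; the names (CustomerView, CustomerPresenter) and the validation rule are illustrative, not from the actual codebase:

```python
# Minimal MVP sketch: the form shrinks to a thin "view" behind an interface,
# and the presenter holds the logic so it can be unit-tested without any UI.

class CustomerView:
    """What the presenter needs from a form; the real WinForms form implements this."""
    def name_input(self) -> str: ...
    def show_error(self, message: str) -> None: ...
    def show_customers(self, names: list) -> None: ...

class CustomerPresenter:
    def __init__(self, view: CustomerView, store=None):
        self.view = view
        self.store = store if store is not None else []  # stands in for the data layer

    def on_save_clicked(self):
        # Logic that previously lived in the form's button handler:
        name = self.view.name_input().strip()
        if not name:
            self.view.show_error("Name is required")
            return
        self.store.append(name)
        self.view.show_customers(list(self.store))

class FakeCustomerView(CustomerView):
    """In tests, a fake view replaces the form entirely."""
    def __init__(self, text=""):
        self.text, self.last_error, self.shown = text, None, []
    def name_input(self): return self.text
    def show_error(self, message): self.last_error = message
    def show_customers(self, names): self.shown = names
```

Because the presenter only sees the interface, a fake view stands in for the form in unit tests - which is what let us test the forms' logic without driving the DevExpress UI at all.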
This helped speed up the writing of tests; today, we have 500+ of them. We also started a QA page which listed the points to be checked for each module. This page helped everyone, including the QA team. The state of the software today:

1) The software is more stable. Most of the business logic is now in good condition, with very few bugs.

2) The user interface still has some issues.

3) We have many tests, but some of them are unreadable and look very ugly. We have used a BDD style of language for some of the tests, in given\when\then style.

4) The LINQ-to-SQL entities had only set\get properties, and we had violated the "Law of Demeter" throughout the codebase: instead of asking Customer.IsYourName(x), we had code like (if customer.Name == x). So we did some refactoring and added more to the entities, such as validation rules, and moved functions which acted upon entity data into the entities themselves.

5) A lot of duplication still exists, which can be removed.

6) We have an acceptable performance metrics page for different scenarios. We have tested a lot and improved the LINQ-to-SQL queries and the stored procedures, but a lot of work still needs to be done.

7) The software is currently in the testing stage; the users are testing part of the system with live data.

Some concluding observations:

1) The architecture of a system can evolve. An evolving architecture solves the problems actually faced, so everyone in the team is able to understand the reasoning behind the changes.

2) Some of the decisions related to architecture could have been made early. For example, the application is for multiple users located in different countries, so it seems some thought should have been given up front to questions like: Where will the database be located? Will we need replication to handle some scenarios? Should we develop a desktop or a web application? Also, there seems to be a lack of literature for such scenarios.
What are the best practices which have succeeded? Or is our approach - taking such decisions much later, as we did with replication - workable? In fact, once replication was implemented, we had to change all primary keys from int to GUID.

3) A remote set of developers can work well. But I miss pair programming; I tried it a couple of times, but due to the slowness of our broadband, it did not work out.

4) The unit and integration tests gave confidence to the developers as well as the stakeholders. The stakeholders had become jittery and were reluctant to request changes, since after every change old problems cropped up. Having regression tests helped us track such bugs and prevented them from reappearing.

5) The documentation of various aspects of the project on the wiki has helped new developers and the QA process. But the issue is keeping it in sync with the changes being made to the software - and I think we are not always in sync.

6) The tests have really helped us, though I wonder if we are writing too many; the test coverage is around 60%. Still, since we have remote developers, and also turnover in the team, these tests serve as a good safety net.

7) We need a continuous integration server. The developers are not disciplined enough to run the tests before every commit, so failing tests sometimes go unnoticed. We deliver software to the client and QA every few days, but building a setup is a manual process, done by a developer on his local machine. So local changes sometimes creep into the build.
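To make the entity refactoring mentioned earlier concrete: the "Law of Demeter" fix was to replace scattered `customer.Name == x` checks with a question the entity answers itself. A minimal sketch, in Python for brevity - the real entities are C# LINQ-to-SQL classes, and the case-insensitive comparison rule here is an assumption, not the project's actual rule:

```python
# Before the refactoring, call sites all over the codebase reached into the
# entity's data directly:
#
#     if customer.Name == x: ...
#
# After it, the question moves onto the entity, so the rule lives in one place.
class Customer:
    def __init__(self, name: str):
        self.name = name

    def is_your_name(self, other: str) -> bool:
        # One shared comparison rule, instead of each call site deciding
        # for itself whether to trim or ignore case:
        return self.name.strip().lower() == other.strip().lower()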