Solve your problems!

They say that you have to love yourself first to be able to love others.

I believe that!

So why do I only attempt to solve other people’s problems and not my own?

Most professions are about solving other people’s problems. Software development is mainly about that. But have you ever asked yourself what you can do, or have been doing, to solve your own problems?

If you have a problem, there is a HUGE possibility that other people have it too.

Why not create a solution for that problem and, if it’s a good approach, sell it to everyone out there that feels the same?

By doing that, You are the FIRST customer/user of that solution. You’ll feel and know, before anyone else, if it’s good or not. And even if it’s only good for you, and you don’t sell anything, even then, You solved Your problem! You are happy with the solution! With that peace of mind, You have one less problem to think about and you can focus more on creating solutions for other people’s problems.

Try it! Think a little about your problems!

Parameterized Unit Tests with JUnit

The currently supported parameterized tests in JUnit seemed a little bit confusing to me and tricky to implement, at least compared to the NUnit approach. Below I will show a simple test implementation that asserts the result of the division of one number by another.

The NUnit approach:

using System;

using NUnit.Framework;

namespace JBrisk
{
    [TestFixture]
    public class DivideClassTests
    {
        [TestCase(12, 3, 4)]
        [TestCase(12, 2, 6)]
        [TestCase(12, 4, 3)]
        public void Divide(int n, int d, int q)
        {
            Assert.AreEqual(q, n / d);
        }
    }
}

NICE, NUNIT!

The JUnit approach:

package org.jbrisk.tests;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Assert;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

@RunWith(Parameterized.class)
public class JUnitDivideClassTests {

	@Parameters
	public static Collection<Object[]> data() {

		return Arrays.asList(new Object[][] { { 12, 3, 4 }, { 12, 2, 6 }, { 12, 4, 3 } });
	}

	private int n;
	private int d;
	private int q;

	public JUnitDivideClassTests(int n, int d, int q) {

		this.n = n;
		this.d = d;
		this.q = q;
	}

	@Test
	public void test() {

		Assert.assertEquals(q, n / d);
	}
}

JUNIT, Y U NO MAKE IT SIMPLE?

Don’t get me wrong, I love JUnit and use it daily on my projects, but this parameterized test feature could be simpler. The reasons I did not like it:

  • It needs more than 20 lines of code for just a simple test; imagine more complicated ones.
  • WHAT? I have to create a different class per parameterized test? This breaks a commonly used pattern: “One Test Class per Class Under Test”.
  • All the additional methods, fields and constructors reduce the readability and maintainability of the code.

For those out there that write Unit Tests, especially the few on their teams/companies that do, you know how hard it is to maintain a lot of tests, and even harder to convince someone to start doing it. Less code (as long as it does not compromise readability) is better!

Well, I didn’t like it! And what my parents taught me a long time ago was:

When you don’t like something, change it!!!

And so I did! Here is the same test with JUnit, but using the JBriskTestRunner and the @ParamTest annotation implemented in my JBrisk project:

package org.jbrisk.tests;

import org.junit.Assert;

import org.junit.runner.RunWith;
// plus the imports for JBriskTestRunner, @ParamTest and @Values from the JBrisk project

@RunWith(JBriskTestRunner.class)
public class DivideClassTests {


	@ParamTest({ @Values({ "12", "3", "4" }), @Values({ "12", "2", "6" }), @Values({ "12", "4", "3" }) })
	public void test(int q, int n, int d) {

		Assert.assertEquals(q, n / d);
	}
}

The good

  • Reduced number of lines of code.
  • More readable.
  • Easier to maintain.
  • The implemented runner validates that the number of supplied arguments matches the number of expected parameters, and that the supplied values can be converted to the expected types.
  • Since the JBriskTestRunner extends the BlockJUnit4ClassRunner, external tools can execute the parameterized tests (@ParamTest), the non-parameterized tests (@Test) and all the other tests supported by it. Also, you don’t have to change your existing test class structure; you only have to add the @RunWith(JBriskTestRunner.class) annotation. Below is an extended version of the class above and a screenshot of the Eclipse JUnit Test Runner executing its tests:
    package org.jbrisk.tests;
    
    import org.junit.Assert;
    
    import org.junit.Test;
    import org.junit.runner.RunWith;
    
    @RunWith(JBriskTestRunner.class)
    public class DivideClassTests {
    
    	@Test
    	public void normalTest() {
    	
    		Assert.assertEquals(1, 1);
    	}
    	
    	@Test(expected = NullPointerException.class)
    	public void testWithExpectedException() {
    		
    		throw new NullPointerException();		
    	}
    	
    	@ParamTest({ @Values({ "12", "3", "4" }), @Values({ "12", "2", "6" }), @Values({ "12", "4", "3" }) })
    	public void divideTest(int n, int d, int q) {
    
    		Assert.assertEquals(q, n / d);
    	}
    }
    

    JBrisk Parameterized Tests on Eclipse

    Notice how the Test Runner view above shows all 3 runs of the parameterized test, along with the arguments supplied to each one.

The bad

  • As stated here in the Java docs, annotations do not support Object arrays, so, to support tests that receive arguments of different types, I had to resort to an “untyped” String array (a sketch of what such annotation declarations could look like appears right after this list).
  • But fear not: if the JBriskTestRunner cannot convert the supplied argument to the correct parameter type, it will throw a descriptive error, something like: “The value “StringValue” supplied for the argument at idx “1” cannot be parsed to int! Check the inner exception for details.”

  • Java annotations only support primitive types, String, Class, enums and other annotations (or arrays of these). JBriskTestRunner currently supports:
    • byte/Byte
    • char/Character
    • boolean/Boolean
    • short/Short
    • int/Integer
    • long/Long
    • float/Float
    • double/Double
    • String
    • Class
  • You need to add another reference, to the JBrisk project. It’s not that bad, because the JBrisk project does not have dependencies and supports Maven! 😀
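
Just to make the String-array workaround concrete, here is a minimal sketch of what the two annotation declarations could look like. This is an assumption for illustration only; the real JBrisk declarations may differ in package and details:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical sketch: one set of argument values for a single run.
// The member is a String[] because annotation members cannot be Object[].
@Retention(RetentionPolicy.RUNTIME)
@Target({})
@interface Values {

	String[] value();
}

// Hypothetical sketch: a parameterized test carries one @Values entry per run.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface ParamTest {

	Values[] value();
}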

One thing I do want to mention is that the JUnit contributors did an AWESOME job on the Runners object model. It was really easy for me to implement this feature because of that. Thank you! 😀

Easier Unit Tests with JNarcissus

Updated after a major refactor of JNarcissus

For those that write a lot of Unit Tests in Java (yeeeeiiii, me included), I’ve just created an open source project that could interest you folks out there.

Its name is JNarcissus.

Quoting Wikipedia:

In Greek mythology, Narcissus was a hunter from the territory of Thespiae in Boeotia who was renowned for his beauty. He was exceptionally proud, in that he disdained those who loved him. Nemesis saw this and attracted Narcissus to a pool where he saw his own reflection in the waters and fell in love with it, not realizing it was merely an image. Unable to leave the beauty of his reflection, Narcissus died.

JNarcissus won’t be hunting your bugs, but IT WILL fall in love with your objects :D, meaning that it will always notify you about things that should not happen.

Let’s see an example. Let’s say I have to test the following class:

package org.jnarcissus.core.sample;

public class JustAnotherTestClass {

	private String textField1;

	private String textField2;

	/**
	 * Returns the textField1 value.
	 * 
	 * @return textField1 value.
	 */
	public String getTextField1() {
		return textField1;
	}

	/**
	 * Sets the textField1 value.
	 * 
	 * @param textField1
	 *            New value.
	 */
	public void setTextField1(String textField1) {
		this.textField1 = textField1;
	}

	/**
	 * Returns the textField2 value.
	 * 
	 * @return textField2 value.
	 */
	public String getTextField2() {
		return textField1;
	}

	/**
	 * Sets the textField2 value.
	 * 
	 * @param textField2
	 *            New value.
	 */
	public void setTextField2(String textField2) {
		this.textField1 = textField2;
	}
}

Notice how the “copy-paste” demon tricked me there (the getTextField2 and setTextField2 methods are using the wrong field). And let’s face it, no matter how many years have passed, we always do copy-paste :D. If we follow the common Unit Test guideline (test just one requirement/feature per test), it would be really hard to catch this bug. Let’s see:

package org.jnarcissus.core.sample;

import org.junit.Assert;
import org.junit.Test;

public class JustAnotherTestClassTests {

	@Test
	public void setTextField1_validValue_getTextField1ReturnsSuppliedValue() {
		
		JustAnotherTestClass obj = new JustAnotherTestClass();
		
		Assert.assertNull(obj.getTextField1());
		
		obj.setTextField1("SomeValue");
		
		Assert.assertEquals("SomeValue", obj.getTextField1());
		
		obj.setTextField1(null);
		
		Assert.assertNull(obj.getTextField1());
	}
	
	@Test
	public void setTextField2_validValue_getTextField2ReturnsSuppliedValue() {
		
		JustAnotherTestClass obj = new JustAnotherTestClass();
		
		Assert.assertNull(obj.getTextField2());
		
		obj.setTextField2("SomeValue");
		
		Assert.assertEquals("SomeValue", obj.getTextField2());
		
		obj.setTextField2(null);
		
		Assert.assertNull(obj.getTextField2());
	}
}

Both tests will pass, because they each test only one feature. I know this is the Unit Testing philosophy, but you could also say it is one of its weaknesses. For that (and a lot of other cases) JNarcissus was made! Let’s see how we solve this problem with JNarcissus:

package org.jnarcissus.core.sample;

import org.jnarcissus.core.JNarcissus;
import org.junit.Assert;
import org.junit.Test;

public class JustAnotherTestClassTests {
	
	@Test
	public void setTextField1_validValueAndAssertsWithJNarcissus_getTextField1ReturnsSuppliedValue() {

		JustAnotherTestClass obj = JNarcissus.create(JustAnotherTestClass.class);

		JNarcissus.assertNull(obj.getTextField1());
		
		JNarcissus.assertNull(obj.getTextField2());
		
		obj.setTextField1("SomeValue");

		JNarcissus.assertEquals("SomeValue", obj.getTextField1());

		obj.setTextField1(null);

		JNarcissus.assertNull(obj.getTextField1()).andPreviousAsserts();
	}

	@Test
	public void setTextField2_validValueAndAssertsWithJNarcissus_getTextField2ReturnsSuppliedValue() {

		JustAnotherTestClass obj = JNarcissus.create(JustAnotherTestClass.class);

		JNarcissus.assertNull(obj.getTextField1());
		
		JNarcissus.assertNull(obj.getTextField2());

		obj.setTextField2("SomeValue");

		JNarcissus.assertEquals("SomeValue", obj.getTextField2()).andPreviousAsserts();

		obj.setTextField2(null);

		JNarcissus.assertNull(obj.getTextField2()).andPreviousAsserts();
	}
}

Executing the code now throws the following error:

java.lang.AssertionError: Called: org.jnarcissus.core.sample.JustAnotherTestClass.getTextField1()
Expected: null
Actual: “SomeValue”

NICE! Now the tests fail!!! Aaaaa, the red color… I don’t know if I love making tests fail or making tests pass more! 🙂 But how does the magic happen? Well, the first important line is this:

JustAnotherTestClass obj = JNarcissus.create(JustAnotherTestClass.class);

This will create an instance of the supplied class, but this instance will be special: it will have all its method calls monitored by JNarcissus. But JNarcissus does not know yet what to monitor. Normally in a test, we ASSERT what we expect to be true (a value equal to an expected value, a condition returning false, etc.). Look at the following line:

JNarcissus.assertNull(obj.getTextField1());

With the line above, we are providing the information JNarcissus needs to monitor our instance. It means something like this:

JNarcissus, assert that the obj.getTextField1() method returns null and, UNLESS I SAY OTHERWISE, all subsequent calls to the obj.getTextField1() method should return null. ALWAYS.

As our test continues, we may need to update the monitored information. Look at the next two lines:

JNarcissus.assertNull(obj.getTextField2());
obj.setTextField2("SomeValue");

The first line just does the same as the previous assert, but now for the obj.getTextField2() method. But the second line calls the obj.setTextField2 method with a value. This means that the expected return value of the obj.getTextField2() method should NOT BE NULL anymore. It should be equal to the one supplied to the obj.setTextField2 method. We need to update this information in JNarcissus (remember the “UNLESS I SAY OTHERWISE” condition?). We do this simply by calling another assert on the same method:

JNarcissus.assertEquals("SomeValue", obj.getTextField2()).andPreviousAsserts();

To JNarcissus, this means something like this:

JNarcissus, assert that the obj.getTextField2() method returns a value equal to “SomeValue” and, UNLESS I SAY OTHERWISE, all subsequent calls to the obj.getTextField2() method should return a value equal to “SomeValue”. ALWAYS. And also, execute all the previous asserts I made until now.

The .andPreviousAsserts() method executes all the asserts the code made before. And of course, it will only execute the most recent assert for a given method. I like to call this feature Assertive Memory, because it is basically a memory of all the current asserts made on an object. If we were to look at the assertive memory state throughout the test, it would look like this:

                       obj.getTextField1()   obj.getTextField2()
Before setting value   Assert is null        Assert is null
After setting value    Assert is null        Assert is equal to “SomeValue”
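
For the curious, here is a rough sketch of how such an assertive memory could be stored: a map from a (method, arguments) key to the most recent Hamcrest matcher registered for that call. This is only an illustration under my own assumptions; the real JNarcissus internals may look quite different:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.hamcrest.Matcher;

// Illustration only: stores the most recent matcher per (method, arguments) pair.
public class AssertiveMemory {

	private final Map<List<Object>, Matcher<?>> asserts = new HashMap<List<Object>, Matcher<?>>();

	// Registering a new assert for a call replaces the previous one for that call.
	public void register(String methodName, Object[] args, Matcher<?> matcher) {
		asserts.put(key(methodName, args), matcher);
	}

	// Re-checks the stored assert (if any) against the value a monitored call returned.
	public void verify(String methodName, Object[] args, Object actual) {
		Matcher<?> matcher = asserts.get(key(methodName, args));
		if (matcher != null && !matcher.matches(actual)) {
			throw new AssertionError("Called: " + methodName + Arrays.toString(args)
					+ " - unexpected value: " + actual);
		}
	}

	// The key combines the method name and the exact arguments used in the call.
	private List<Object> key(String methodName, Object[] args) {
		List<Object> key = new ArrayList<Object>();
		key.add(methodName);
		key.addAll(Arrays.asList(args));
		return key;
	}
}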

When you are updating the information (read: the assert you want to keep stored) about a method, JNarcissus does not only take into consideration the method you are calling, but also the arguments you use. For example, let’s say we have a method that accepts an argument, and that we want to monitor different return values from it:

/**
 * A method with an argument that impacts the returned value.
 * 
 * @param number
 *            Number argument.
 * @return The supplied number to String.
 */
public String numberToString(int number) {

	if (number == 0)
		return "Zero";
	else if (number == 1)
		return "One";
	else
		return "Don't know";
}

We can monitor methods with arguments the same way we do with methods without arguments:

@Test
public void numberToString_returnsCorrectValue() {

	JustAnotherTestClass obj = JNarcissus.create(JustAnotherTestClass.class);

	JNarcissus.assertEquals("Zero", obj.numberToString(0));
	JNarcissus.assertEquals("One", obj.numberToString(1));
	JNarcissus.assertEquals("Don'tknow", obj.numberToString(2));
}

What this means to JNarcissus:

JNarcissus, assert that the obj.numberToString method called with the argument 0 returns a value equal to “Zero” and, UNLESS I SAY OTHERWISE, all subsequent calls to the obj.numberToString method with the argument 0 should return a value equal to “Zero”. ALWAYS.
JNarcissus, assert that the obj.numberToString method called with the argument 1 returns a value equal to “One” and, UNLESS I SAY OTHERWISE, all subsequent calls to the obj.numberToString method with the argument 1 should return a value equal to “One”. ALWAYS.
JNarcissus, assert that the obj.numberToString method called with the argument 2 returns a value equal to “Don’tknow” and, UNLESS I SAY OTHERWISE, all subsequent calls to the obj.numberToString method with the argument 2 should return a value equal to “Don’tknow”. ALWAYS.

Executing this test will throw the error:

java.lang.AssertionError: org.jnarcissus.core.sample.JustAnotherTestClass.numberToString(2)
Expected: “Don’tknow”
Actual: “Don’t know”

I’ve made this error on purpose, just to show that JNarcissus also monitors method calls with arguments. We can easily fix the test by correcting the string “Don’tknow” to “Don’t know”.

Why all the trouble?

Well, I wrote JNarcissus because at some point while doing Unit Tests, I noticed that in some more complicated cases I was doing the same asserts very often. JNarcissus’ main reason to exist was to erase du/tri/quadru/####plicated “assert” code. The other main reason was to understand how Mocking frameworks are made :D. You see, under the hood, JNarcissus uses the same technique most Mocking Frameworks use, that is: at runtime, generate a class that extends the class you want to mock, and direct all method calls made to that instance to the mocking framework stubs. In fact, the first version of JNarcissus was based on the GREAT AND AWESOME (I really, really like it) mocking framework Mockito. You can see the influence from Mocking frameworks, since in JNarcissus you can actually assert information about the monitored methods:

JNarcissus.assertMethod(obj, new CallCountMatcher(1)).numberToString(0);
JNarcissus.assertMethod(obj, new CallCountMatcher(new GreaterOrEqual<Integer>(1))).numberToString(0);

The first line above asserts that the number of calls made to the obj.numberToString(0) method was exactly 1. The second line shows an overload of the CallCountMatcher constructor that receives a Matcher object (yes, from the awesome Hamcrest library) and asserts that the number of calls to the obj.numberToString(0) method is greater than or equal to 1. This is a good feature when you use dependency injection in your tests and do not want to create a mock, but simply want to verify that a method was called.
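
To give an idea of the interception technique, here is a minimal sketch using the JDK’s built-in java.lang.reflect.Proxy. One big assumption to note: Proxy only works for interfaces, while JNarcissus (like most mocking frameworks) generates a subclass at runtime so it can also monitor concrete classes; this only shows the call-routing idea in its simplest form:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class MonitoringExample {

	interface Greeter {
		String greet(String name);
	}

	public static void main(String[] args) {

		final Greeter real = new Greeter() {
			public String greet(String name) {
				return "Hello, " + name;
			}
		};

		// Every call on the proxy is routed through this handler, where a
		// framework can record the call and re-run any stored asserts.
		Greeter monitored = (Greeter) Proxy.newProxyInstance(
				Greeter.class.getClassLoader(),
				new Class<?>[] { Greeter.class },
				new InvocationHandler() {
					public Object invoke(Object proxy, Method method, Object[] a) throws Throwable {
						System.out.println("Called: " + method.getName());
						return method.invoke(real, a); // delegate to the real object
					}
				});

		monitored.greet("World"); // prints "Called: greet" before returning "Hello, World"
	}
}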

Internally, JNarcissus uses Hamcrest matchers to do the asserts, so you can also do something like this:

// These two lines mean the same thing
JNarcissus.assertNull(obj.getTextField1());
JNarcissus.assertThat(obj.getTextField1(), CoreMatchers.nullValue());

What’s next?

Well, JNarcissus is still in its beginnings, but I already use it for writing Unit Tests in my projects. I plan to port JNarcissus to .NET as well, but I still don’t know when that will be. Like most of the mocking and dependency injection frameworks that use the “derived class technique” or “proxy class technique”, JNarcissus has some limitations: it cannot monitor final methods or classes. I will try to overcome this limitation in the future, but only if there is demand for it.

The proud Developer…

(this is a post inspired by real people, but it does not represent anyone in particular)

Hi,

I’m the Proud Developer… I’m really proud and happy with my work! I love working as a Developer, providing and sometimes creating things that did not exist before, or at least making existing things better. I just love to deliver software that is not only usable, but easy to use, fast, secure, responsive, scalable and well made! To get there, I write clean and organized code, always refactor when needed and possible (hey, I’m not perfect, I don’t always get it right the first time) and try to document everything I can (and time allows). Since I’m also humble, I tend to implement Tests (especially automated Unit Tests) for most of the code I write. At least this way I can have peace of mind that when I change something (implementing new things or refactoring old ones) I don’t break any functionality.

And I do all of this with a PIECE OF SH$# COMPUTER! It’s slow, our build times are huge, the committed memory is always surpassing the amount of free memory so it’s always swapping to virtual memory (a lot slower), but hey, I’m a proud developer.

In fact, I’m so proud that I don’t really care that my build takes 4 minutes, and that every iteration (code, compile, test) takes almost 5 minutes.

I know that 5 minutes is a lot, especially knowing some things about response time. Quoting Jakob Nielsen from http://www.useit.com/alertbox/response-times.html:

Response-Time Limits

The 3 response-time limits are the same today as when I wrote about them in 1993 (based on 40-year-old research by human factors pioneers):

  • 0.1 seconds gives the feeling of instantaneous response — that is, the outcome feels like it was caused by the user, not the computer. This level of responsiveness is essential to support the feeling of direct manipulation (direct manipulation is one of the key GUI techniques to increase user engagement and control — for more about it, see our Principles of Interface Design seminar).
  • 1 second keeps the user’s flow of thought seamless. Users can sense a delay, and thus know the computer is generating the outcome, but they still feel in control of the overall experience and that they’re moving freely rather than waiting on the computer. This degree of responsiveness is needed for good navigation.
  • 10 seconds keeps the user’s attention. From 1–10 seconds, users definitely feel at the mercy of the computer and wish it was faster, but they can handle it. After 10 seconds, they start thinking about other things, making it harder to get their brains back on track once the computer finally does respond.

I’m so proud of my profession that I do everything I can to achieve the best response time possible (lower than 1 second) in the software that I make, but I don’t mind at all waiting 5 minutes to test some code I’m writing for that very software. Hey, don’t get me wrong, I wish I could have an iteration (code, compile, test) that would take less than 1 second, but I’m not a final user, I’m a developer :D. I can take this. I want to make responsive software, but I don’t really mind that the software I use to make it takes 300x longer (5 minutes versus 1 second) to respond to me.

Another interesting fact: I’m always concerned about memory leaks and low memory consumption. I tend to create objects with a small memory footprint, do proper management of resources, lazy-load when possible, etc. I confess, it’s a bit hard to deliver that quality of software with the tight project schedules and everything, but hey, don’t forget I’m the proud developer… I work overtime even without getting paid. As soon as I open my development environment, my computer’s memory starts to scream for some free space, but I don’t mind. I’m a developer… and a proud one. I don’t dare to put myself in the same position as a final user (the ones I’m making software for). I’m a DEVELOPER, with all the capital letters. My software won’t have high memory usage problems, but… if the software I use to develop it has those, well, I can take it…

WHAT THE HECK IS THAT? Am I insane? Am I becoming a hypocrite? How can I expect to achieve something, or even demand something from someone (the user’s demand for responsive software, low memory usage, security, etc.), if I don’t want, demand or simply don’t mind not having it for myself?

WAKE UP, FELLOW DEVELOPERS! If your build process takes more than 1 second to respond, TALK WITH YOUR MANAGER! If you don’t have one, fix the problem yourself. I don’t really get why the stability, responsiveness and performance of the developer machine, the build process, or any other part of the development process tends to be in LAST place (along with testing :D) on the list of priorities of the Management Team and (god, no) in the minds of developers themselves. Here is how it happens:

Person in Charge: People, we have a tight schedule, we must deliver this on time. Also, the requirements and expectations are way high, so the application must work well under stress, be fast, scalable and reliable. Can we make it on time?

Developer thinking: (Man, it will be hard to get things done on time, especially since my computer is so slow, but hey, I’m a proud developer, I won’t show any weakness.)

Developer says: Yeah, I think we can.

One piece of advice for the developers: Bug your Manager EVERY DAY, MORE THAN ONCE A DAY, about things that are slow and don’t help you do your work. Your computer is slow? Bug him. Your computer does not have free memory? Bug him. The compilation process takes too long? Bug him. If you happen to work with agile methods, speed is crucial… Don’t ignore the time you waste on things that are not development itself (compiling, waiting for the application to load, checking out/committing things to the repository, etc.). If you can, try to measure these wasted times and do the math of how much time you waste in a day, a week and a month. Multiply these times by the number of developers on the project you work on, show the results to your manager and watch how big his eyes get! 😉

One piece of advice for the management team: Ask your developers EVERY DAY, MORE THAN ONCE A DAY, about things that are slow and don’t help them do their work. Their computer is slow? Fix it. Their computer does not have free memory? Fix it. The compilation process takes too long? Fix it. If you happen to work with agile methods, speed is crucial… Don’t ignore the time they waste on things that are not development itself (compiling, waiting for the application to load, checking out/committing things to the repository, etc.). If you can, try to measure these wasted times and do the math of how much time they waste in a day, a week and a month. Multiply these times by the number of developers on the project, show the results to them and watch how big their eyes get! 😉 A worked example of this math follows below.
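
To make that math concrete, take the numbers from my own situation above as an illustration: a 4-minute build, run, say, 10 times a day, costs a developer 40 minutes a day. Over a 20-working-day month, that is roughly 13 hours per developer. On a team of 10 developers, that is about 130 developer-hours a month lost to a single slow step, before counting any of the other waits.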

The quality of the development environment should be EVERYONE’S TOP priority. It pains me to see that it’s the last. How do you expect to make things good and fast if you waste precious time on things that should not be? Is it really that hard to comprehend?

I will be really glad to see the opinions of other developers, managers, CEOs, etc. Let’s make software development more productive.

NAnt and Unit Testing: Create unit tests for NAnt projects – build files – custom tasks/functions

Since the project I’m working on has a LOT of NAnt build scripts, and we are at a crucial time in our development, every change to the build scripts themselves, or to our custom-made Functions and Tasks, could (read: can) break an existing piece of script, create a bug, harm or prevent a developer’s work, leave the QA team without an updated deploy, etc.

To prevent this I thought: HEY, let’s unit test our build process! 😀

We are trying to increase our Unit Test coverage, but one place where we did not have any was the build process itself. I searched everywhere for how to do this but did not find a proper answer. I downloaded the NAnt source code and, after some investigation, came to the following conclusion:

You can execute NAnt inside a #PLACE_YOUR_FAVORITE_UNIT_TESTING_FRAMEWORK_FIRST_LETTER_HERE#_Unit test, or in any .NET application, just by doing the following:

  • Choose your unit testing framework.
  • Add a reference to your chosen unit testing framework’s assemblies.
  • Add a reference to the NAnt.Core assembly (it has all the important classes like Project, Task, etc.).
  • Add a reference to NAnt.Win32Tasks (required because NAnt uses the readregistry task from this assembly to retrieve the desired .NET framework configuration from the registry).
  • Add a reference to any additional assemblies you need (your custom task assembly, other task assemblies you use in your project, etc.).
  • Add the required NAnt configuration to the App.config of your unit test project. I just took the config for the corresponding framework (we use .NET 4.0 here) from the nant.exe.config file that ships with the latest NAnt version. I added the other configuration sections as well (log, assembly probe, etc.).

<?xml version="1.0"?>
<configuration>
  <!-- Leave this alone. Sets up configsectionhandler section -->
  <configSections>
    <section name="nant" type="NAnt.Core.ConfigurationSection, NAnt.Core" />
    <section name="log4net" type="System.Configuration.IgnoreSectionHandler" />
  </configSections>
  <appSettings>
    <!-- Used to indicate the location of the cache folder for shadow files -->
    <add key="shadowfiles.path" value="%temp%\nunit20\ShadowCopyCache" />
    <!-- Used to indicate that NAnt should shadow copy files in a cache folder near the executable -->
    <add key="nant.shadowfiles" value="False" />
    <!-- Used to indicate if cached files should be deleted when done running-->
    <add key="nant.shadowfiles.cleanup" value="False" />
    <!-- To enable internal log4net logging, uncomment the next line -->
    <!-- <add key="log4net.Internal.Debug" value="true"/> -->
  </appSettings>
  <!-- nant config settings -->
  <nant>
    <frameworks>
      <platform name="win32" default="auto">
        <task-assemblies>
          <!-- include NAnt task assemblies -->
          <include name="*Tasks.dll" />
          <!-- include NAnt test assemblies -->
          <include name="*Tests.dll" />
          <!-- include framework-neutral assemblies -->
          <include name="extensions/common/neutral/**/*.dll" />
          <!-- exclude Microsoft.NET specific task assembly -->
          <exclude name="NAnt.MSNetTasks.dll" />
          <!-- exclude Microsoft.NET specific test assembly -->
          <exclude name="NAnt.MSNet.Tests.dll" />
        </task-assemblies>
        <framework
               name="net-4.0"
               family="net"
               version="4.0"
               description="Microsoft .NET Framework 4.0"
               sdkdirectory="${sdkInstallRoot}"
               frameworkdirectory="${path::combine(installRoot, 'v4.0.30319')}"
               frameworkassemblydirectory="${path::combine(installRoot, 'v4.0.30319')}"
               clrversion="4.0.30319"
               clrtype="Desktop"
               vendor="Microsoft"
                    >
          <runtime>
            <probing-paths>
              <directory name="lib/common/2.0" />
              <directory name="lib/common/neutral" />
            </probing-paths>
            <modes>
              <strict>
                <environment>
                  <variable name="COMPLUS_VERSION" value="v4.0.30319" />
                </environment>
              </strict>
            </modes>
          </runtime>
          <reference-assemblies basedir="${path::combine(installRoot, 'v4.0.30319')}">
            <include name="Accessibility.dll" />
            <include name="Microsoft.Build.Conversion.v4.0.dll" />
            <include name="Microsoft.Build.dll" />
            <include name="Microsoft.Build.Engine.dll" />
            <include name="Microsoft.Build.Framework.dll" />
            <include name="Microsoft.Build.Tasks.v4.0.dll" />
            <include name="Microsoft.Build.Utilities.v4.0.dll" />
            <include name="Microsoft.CSharp.dll" />
            <include name="Microsoft.Data.Entity.Build.Tasks.dll" />
            <include name="Microsoft.JScript.dll" />
            <include name="Microsoft.Transactions.Bridge.dll" />
            <include name="Microsoft.Transactions.Bridge.Dtc.dll" />
            <include name="Microsoft.VisualBasic.Activities.Compiler.dll" />
            <include name="Microsoft.VisualBasic.Compatibility.Data.dll" />
            <include name="Microsoft.VisualBasic.Compatibility.dll" />
            <include name="Microsoft.VisualBasic.dll" />
            <include name="Microsoft.VisualC.dll" />
            <include name="Microsoft.VisualC.STLCLR.dll" />
            <include name="mscorlib.dll" />
            <include name="System.Activities.Core.Presentation.dll" />
            <include name="System.Activities.dll" />
            <include name="System.Activities.DurableInstancing.dll" />
            <include name="System.Activities.Presentation.dll" />
            <include name="System.AddIn.Contract" />
            <include name="System.AddIn.dll" />
            <include name="System.ComponentModel.Composition.dll" />
            <include name="System.ComponentModel.DataAnnotations.dll" />
            <include name="System.Configuration.dll" />
            <include name="System.Configuration.Install.dll" />
            <include name="System.Core.dll" />
            <include name="System.Data.DataSetExtensions.dll" />
            <include name="System.Data.dll" />
            <include name="System.Data.Entity.Design.dll" />
            <include name="System.Data.Entity.dll" />
            <include name="System.Data.Linq.dll" />
            <include name="System.Data.OracleClient.dll" />
            <include name="System.Data.Services.Client.dll" />
            <include name="System.Data.Services.Design.dll" />
            <include name="System.Data.Services.dll" />
            <include name="System.Data.SqlXml.dll" />
            <include name="System.Deployment.dll" />
            <include name="System.Design.dll" />
            <include name="System.Device.dll" />
            <include name="System.DirectoryServices.dll" />
            <include name="System.DirectoryServices.Protocols.dll" />
            <include name="System.dll" />
            <include name="System.Drawing.Design.dll" />
            <include name="System.Drawing.dll" />
            <include name="System.Dynamic.dll" />
            <include name="System.EnterpriseServices.dll" />
            <include name="System.EnterpriseServices.Thunk.dll" />
            <include name="System.EnterpriseServices.Wrapper.dll" />
            <include name="System.IdentityModel.dll" />
            <include name="System.IdentityModel.Selectors.dll" />
            <include name="System.IO.Log.dll" />
            <include name="System.Management.dll" />
            <include name="System.Management.Instrumentation.dll" />
            <include name="System.Messaging.dll" />
            <include name="System.Net.dll" />
            <include name="System.Numerics.dll" />
            <include name="System.Runtime.Caching.dll" />
            <include name="System.Runtime.DurableInstancing.dll" />
            <include name="System.Runtime.Remoting.dll" />
            <include name="System.Runtime.Serialization.dll" />
            <include name="System.Runtime.Serialization.Formatters.Soap.dll" />
            <include name="System.Security.dll" />
            <include name="System.ServiceModel.Activation.dll" />
            <include name="System.ServiceModel.Activities.dll" />
            <include name="System.ServiceModel.Channels.dll" />
            <include name="System.ServiceModel.Discovery.dll" />
            <include name="System.ServiceModel.dll" />
            <include name="System.ServiceModel.Routing.dll" />
            <include name="System.ServiceModel.ServiceMoniker40.dll" />
            <include name="System.ServiceModel.WasHosting.dll" />
            <include name="System.ServiceModel.Web.dll" />
            <include name="System.ServiceProcess.dll" />
            <include name="System.Transactions.dll" />
            <include name="System.Web.Abstractions.dll" />
            <include name="System.Web.ApplicationServices.dll" />
            <include name="System.Web.DataVisualization.Design.dll" />
            <include name="System.Web.DataVisualization.dll" />
            <include name="System.Web.dll" />
            <include name="System.Web.DynamicData.Design.dll" />
            <include name="System.Web.DynamicData.dll" />
            <include name="System.Web.Entity.Design.dll" />
            <include name="System.Web.Entity.dll" />
            <include name="System.Web.Extensions.Design.dll" />
            <include name="System.Web.Extensions.dll" />
            <include name="System.Web.Mobile.dll" />
            <include name="System.Web.RegularExpressions.dll" />
            <include name="System.Web.Routing.dll" />
            <include name="System.Web.Services.dll" />
            <include name="System.Windows.Forms.DataVisualization.Design.dll" />
            <include name="System.Windows.Forms.DataVisualization.dll" />
            <include name="System.Windows.Forms.dll" />
            <include name="System.Workflow.Activities.dll" />
            <include name="System.Workflow.ComponentModel.dll" />
            <include name="System.Workflow.Runtime.dll" />
            <include name="System.WorkflowServices.dll" />
            <include name="System.Xaml.dll" />
            <include name="System.Xaml.Hosting.dll" />
            <include name="System.Xml.dll" />
            <include name="System.Xml.Linq.dll" />
          </reference-assemblies>
          <reference-assemblies basedir="${environment::get-folder-path('ProgramFiles')}/Reference Assemblies/Microsoft/Framework/.NETFramework/v4.0">
            <include name="Microsoft.Build.Conversion.v4.0.dll" />
            <include name="Microsoft.Build.dll" />
            <include name="Microsoft.Build.Engine.dll" />
            <include name="Microsoft.Build.Framework.dll" />
            <include name="Microsoft.Build.Tasks.v4.0.dll" />
            <include name="Microsoft.Build.Utilities.v4.0.dll" />
            <include name="Microsoft.CSharp.dll" />
            <include name="Microsoft.JScript.dll" />
            <include name="Microsoft.VisualBasic.Compatibility.Data.dll" />
            <include name="Microsoft.VisualBasic.Comptatibility.dll" />
            <include name="Microsoft.VisualBasic.dll" />
            <include name="Microsoft.VisualC.dll" />
            <include name="Microsoft.VisualC.STLCLR.dll" />
            <include name="mscorlib.dll" />
            <include name="PresentationBuildTasks.dll" />
            <include name="PresentationCore.dll" />
            <include name="PresentationFramework.Aero.dll" />
            <include name="PresentationFramework.Classic.dll" />
            <include name="PresentationFramework.Luna.dll" />
            <include name="PresentationFramework.Royale.dll" />
            <include name="ReachFramework.dll" />
            <include name="System.Activities.Core.Presentation.dll" />
            <include name="System.Activities.dll" />
            <include name="System.Activities.DurableInstancing.dll" />
            <include name="System.Activities.Presentation.dll" />
            <include name="System.AddIn.Contract.dll" />
            <include name="System.AddIn.dll" />
            <include name="System.ComponentModel.Composition.dll" />
            <include name="System.ComponentModel.DataAnnotations.dll" />
            <include name="System.Configuration.dll" />
            <include name="System.Core.dll" />
            <include name="System.Data.DataSetExtension.dll" />
            <include name="System.Data.dll" />
            <include name="System.Data.Entity.Design.dll" />
            <include name="System.Data.Entity.dll" />
            <include name="System.Data.Linq.dll" />
            <include name="System.Data.OracleClient.dll" />
            <include name="System.Data.Services.Client.dll" />
            <include name="System.Data.Services.Design.dll" />
            <include name="System.Data.Services.dll" />
            <include name="System.Data.SqlXml.dll" />
            <include name="System.Deployment.dll" />
            <include name="System.Design.dll" />
            <include name="System.Device.dll" />
            <include name="System.DirectoryServices.AccountManagement.dll" />
            <include name="System.DirectoryServices.dll" />
            <include name="System.DirectoryServices.Protocols.dll" />
            <include name="System.dll" />
            <include name="System.Drawing.Design.dll" />
            <include name="System.Drawing.dll" />
            <include name="System.EnterpriseServices.dll" />
            <include name="System.EnterpriseServices.Thunk.dll" />
            <include name="System.EnterpriseServices.Wrapper.dll" />
            <include name="System.IdentityModel.dll" />
            <include name="System.IdentityModel.Selectors.dll" />
            <include name="System.IO.Log.dll" />
            <include name="System.Management.dll" />
            <include name="System.Management.Instrumentation.dll" />
            <include name="System.Messaging.dll" />
            <include name="System.Net.dll" />
            <include name="System.Numerics.dll" />
            <include name="System.Printing.dll" />
            <include name="System.Runtime.Caching.dll" />
            <include name="System.Runtime.DurableInstancing.dll" />
            <include name="System.Runtime.Remoting.dll" />
            <include name="System.Runtime.Serialization.dll" />
            <include name="System.Runtime.Serialization.Formatters.Soap.dll" />
            <include name="System.Security.dll" />
            <include name="System.ServiceModel.Activation.dll" />
            <include name="System.ServiceModel.Activities.dll" />
            <include name="System.ServiceModel.Channels.dll" />
            <include name="System.ServiceModel.Discovery.dll" />
            <include name="System.ServiceModel.dll" />
            <include name="System.ServiceModel.Routing.dll" />
            <include name="System.ServiceModel.Web.dll" />
            <include name="System.ServiceProcess.dll" />
            <include name="System.Speech.dll" />
            <include name="System.Transactions.dll" />
            <include name="System.Web.Abstractions.dll" />
            <include name="System.Web.ApplicationServices.dll" />
            <include name="System.Web.DataVisualization.Design.dll" />
            <include name="System.Web.DataVisualization.dll" />
            <include name="System.Web.dll" />
            <include name="System.Web.DynamicData.Design.dll" />
            <include name="System.Web.DynamicData.dll" />
            <include name="System.Web.Entity.Design.dll" />
            <include name="System.Web.Entity.dll" />
            <include name="System.Web.Extensions.Design.dll" />
            <include name="System.Web.Extensions.dll" />
            <include name="System.Web.Mobile.dll" />
            <include name="System.Web.RegularExpressions.dll" />
            <include name="System.Web.Routing.dll" />
            <include name="System.Web.Services.dll" />
            <include name="System.Windows.Forms.DataVisualization.Design.dll" />
            <include name="System.Windows.Forms.DataVisualization.dll" />
            <include name="System.Windows.Forms.dll" />
            <include name="System.Windows.Input.Manipulations.dll" />
            <include name="System.Windows.Presentation.dll" />
            <include name="System.Workflow.Activities.dll" />
            <include name="System.Workflow.ComponentModel.dll" />
            <include name="System.Workflow.Runtime.dll" />
            <include name="System.WorkflowServices.dll" />
            <include name="System.Xaml.dll" />
            <include name="System.Xml.dll" />
            <include name="System.Xml.Linq.dll" />
            <include name="WindowsBase.dll" />
          </reference-assemblies>
          <task-assemblies>
            <!-- include MS.NET version-neutral assemblies -->
            <include name="extensions/net/neutral/**/*.dll" />
            <!-- include MS.NET 4.0 specific assemblies -->
            <include name="extensions/net/4.0/**/*.dll" />
            <!-- include MS.NET specific task assembly -->
            <include name="NAnt.MSNetTasks.dll" />
            <!-- include MS.NET specific test assembly -->
            <include name="NAnt.MSNet.Tests.dll" />
            <!-- include .NET 4.0 specific assemblies -->
            <include name="extensions/common/4.0/**/*.dll" />
          </task-assemblies>
          <tool-paths>
            <directory name="${sdkInstallRoot}"
                if="${property::exists('sdkInstallRoot')}" />
            <directory name="${path::combine(installRoot, 'v4.0.30319')}" />
          </tool-paths>
          <project>
            <readregistry
                property="installRoot"
                key="SOFTWARE\Microsoft\.NETFramework\InstallRoot"
                hive="LocalMachine" />
            <locatesdk property="sdkInstallRoot" minwinsdkver="v7.0A" minnetfxver="4.0" maxnetfxver="4.0.99999" failonerror="false" />
            <!--
                        <echo message="sdkInstallRoot=${sdkInstallRoot}" if="${property::exists('sdkInstallRoot')}" />
                        <readregistry
                            property="sdkInstallRoot"
                            key="SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A\WinSDK-NetFx40Tools\InstallationFolder"
                            hive="LocalMachine"
                            failonerror="false" />
                        <readregistry
                            property="sdkInstallRoot"
                            key="SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A\WinSDK-NetFx40Tools-x86\InstallationFolder"
                            hive="LocalMachine"
                            failonerror="false" />
                        -->
          </project>
          <tasks>
            <task name="csc">
              <attribute name="supportsnowarnlist">true</attribute>
              <attribute name="supportswarnaserrorlist">true</attribute>
              <attribute name="supportskeycontainer">true</attribute>
              <attribute name="supportskeyfile">true</attribute>
              <attribute name="supportsdelaysign">true</attribute>
              <attribute name="supportsplatform">true</attribute>
              <attribute name="supportslangversion">true</attribute>
            </task>
            <task name="vbc">
              <attribute name="supportsdocgeneration">true</attribute>
              <attribute name="supportsnostdlib">true</attribute>
              <attribute name="supportsnowarnlist">true</attribute>
              <attribute name="supportskeycontainer">true</attribute>
              <attribute name="supportskeyfile">true</attribute>
              <attribute name="supportsdelaysign">true</attribute>
              <attribute name="supportsplatform">true</attribute>
              <attribute name="supportswarnaserrorlist">true</attribute>
            </task>
            <task name="jsc">
              <attribute name="supportsplatform">true</attribute>
            </task>
            <task name="vjc">
              <attribute name="supportsnowarnlist">true</attribute>
              <attribute name="supportskeycontainer">true</attribute>
              <attribute name="supportskeyfile">true</attribute>
              <attribute name="supportsdelaysign">true</attribute>
            </task>
            <task name="resgen">
              <attribute name="supportsassemblyreferences">true</attribute>
              <attribute name="supportsexternalfilereferences">true</attribute>
            </task>
            <task name="delay-sign">
              <attribute name="exename">sn</attribute>
            </task>
            <task name="license">
              <attribute name="exename">lc</attribute>
              <attribute name="supportsassemblyreferences">true</attribute>
            </task>
          </tasks>
        </framework>
      </platform>
    </frameworks>
    <properties>
      <!-- properties defined here are accessible to all build files -->
      <!-- <property name="foo" value = "bar" readonly="false" /> -->
    </properties>
  </nant>
  <!--
        This section contains the log4net configuration settings.

        By default, no messages will be logged to the log4net logging infrastructure.

        To enable the internal logging, set the threshold attribute on the log4net element
        to "ALL".

        When internal logging is enabled, internal messages will be written to the 
        console.
    -->
  <log4net threshold="OFF">
    <appender name="ConsoleAppender" type="log4net.Appender.ConsoleAppender">
      <layout type="log4net.Layout.PatternLayout">
        <param name="ConversionPattern" value="[%c{2}:%m  - [%x] &lt;%X{auth}&gt;]%n" />
      </layout>
    </appender>
    <appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
      <param name="File" value="${APPDATA}\\NAnt\\NAnt.log" />
      <param name="AppendToFile" value="true" />
      <param name="MaxSizeRollBackups" value="2" />
      <param name="MaximumFileSize" value="500KB" />
      <param name="RollingStyle" value="Size" />
      <param name="StaticLogFileName" value="true" />
      <layout type="log4net.Layout.PatternLayout">
        <param name="ConversionPattern" value="[%c{2}:%m  - [%x] &lt;%X{auth}&gt;]%n" />
      </layout>
    </appender>
    <!-- Setup the root category, add the appenders and set the default level -->
    <root>
      <!-- Only log messages with severity ERROR (or higher) -->
      <level value="ERROR" />
      <!-- Log messages to the console -->
      <appender-ref ref="ConsoleAppender" />
      <!-- Uncomment the next line to enable logging messages to the NAnt.log file -->
      <!-- <appender-ref ref="RollingLogFileAppender" /> -->
    </root>
    <!-- Specify the priority for some specific categories -->
    <!--
        <logger name="NAnt.Core.TaskBuilderCollection">
            <level value="DEBUG" />
        </logger>
        <logger name="NAnt">
            <level value="INFO" />
        </logger>
        -->
  </log4net>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <probing privatePath="lib" />
    </assemblyBinding>
    <NetFx40_LegacySecurityPolicy enabled="true"/>
  </runtime>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <!-- .NET Framework 4.0 -->
    <supportedRuntime version="v4.0.30319" />
    <!-- .NET Framework 2.0 -->
    <supportedRuntime version="v2.0.50727" />
    <!-- .NET Framework 1.1 -->
    <supportedRuntime version="v1.1.4322" />
    <!-- .NET Framework 1.0 -->
    <supportedRuntime version="v1.0.3705" />
  </startup>
</configuration>

Wow, that’s a lot… I don’t know if all of it is required, but better safe than sorry. If you use another framework version, just check the nant.exe.config file, because it has them all.

Here is an example of a test using an existing build file and NUnit:

[Test]
public void TestNAntBuildFile()
{
    // Load the build file and run its default target.
    Project proj = new Project("PATH_TO_MY_NANT_PROJECT_FILE.build", Level.Debug, 0);
    proj.Run();

    // You can also execute a specific target by name.
    proj.Execute("TestTarget");

    // Do some asserts, e.g.:
    // Assert.AreEqual(.....
}

Here is how you can create the build file on the fly:

[Test]
public void TestCreatedNAntBuildFile()
{
    // Build the NAnt project definition in memory instead of loading it from a file.
    XmlDocument doc = new XmlDocument();
    doc.LoadXml("<project><target name=\"TestTarget\"><echo message=\"NAnt and Unit tests. A nice couple.\"/></target></project>");

    Project proj = new Project(doc, Level.Debug, 0);
    proj.Run();

    proj.Execute("TestTarget");

    // Do some asserts, e.g.:
    // Assert.AreEqual(.....
}

Hope this helps a lot of automation lovers out there! 😀

IT infrastructure just gained a new friend: DPortAck

In the project I’m working on now, the architecture of the application has become really big along the way.

We have:

  • Around 10 ASP.NET MVC applications that represent the presentation layer of the architecture.
  • Around 15 WCF back-end applications that expose the business logic in a SOA architecture.
  • Around a dozen Windows Services spread across the layers that support the application as a whole.
  • 10 databases on the persistence layer.

When we got the machine layout for the production environment we saw:

  • Every layer is physically separated.
  • There are firewalls between the machines.
  • There is a cluster and a load balancer on the presentation and business layers.

Before deploying the applications we wanted to make sure that the machines had all the needed connectivity between them, since we are hiring a Service Provider to host this infrastructure. We don’t control the infrastructure, so we need to rely on their technical support to fix issues with the firewall, network access, machine configuration, etc.

How does someone normally test port connectivity between machines? There are a number of ways, and they all normally have the same drawback: they require MANUAL work. Telnet, port testers, etc. all involve some manual, tedious process that can easily be forgotten or wrongly executed. I’m a big fan of automation, but the existing solutions out there usually require a lot of installs, agents, configuration, etc.

DPortAck does exactly what we need, easily and automatically. And it does not even need to be installed. You simply write a simple XML file with the tests you want to make, supply its path as an argument to the DPortAck command line, and that’s it: sit back and relax (or do something else while it executes). DPortAck is implemented in C# on .NET 2.0, so the .NET Framework 2.0 is required.

Here is a sample XML file:

<?xml version="1.0" encoding="utf-8" ?>
<Tests xmlns="http://dportack.codeplex.com/">
  <Test name="Sample Test">
    <Machines>
      <Machine host="127.0.0.1">
        <Ports>
          <Port number="80"/>
          <Port number="8080"/>
          <Port number="8081"/>
          <Port number="123"/>
          <Port number="8080" protocol="Udp" socketType="Dgram"/>
        </Ports>
      </Machine>
    </Machines>
  </Test>
</Tests>

Executing this test gave the following output:

Sample Test (MaxTimeout: 5000) – 127.0.0.1 – Tcp.80… ok
Sample Test (MaxTimeout: 5000) – 127.0.0.1 – Tcp.8080… ##### FAIL #####
Sample Test (MaxTimeout: 5000) – 127.0.0.1 – Tcp.8081… ##### FAIL #####
Sample Test (MaxTimeout: 5000) – 127.0.0.1 – Tcp.123… ##### FAIL #####
Sample Test (MaxTimeout: 5000) – 127.0.0.1 – Udp.8080… ok

We just integrated the execution of the production environment tests into our build server, and now we receive notifications when something does not work :D, since DPortAck returns an exit code of 1 when a test throws an Error.

The HUGE power of SSDs while developing HUGE projects

A friend of mine (Jose Formiga, check out his blog) sent me the link to this article on the Coding Horror blog, and I thought it was a nice opportunity to share the experience that we have had while working with SSDs.

Our development team uses Dell Latitude E6510 laptops (Intel Core i7, 4 GB RAM, 7200 RPM HDD) and we are working on a HUGE .NET 4.0 application. Here are some stats:

  • Over 180 projects
  • Over 6 million LOC (lines of code)
  • 8 Databases
  • Over 5 GB of source code

We have everything: ASP.NET MVC front ends, Windows Services, WCF services, Java tools, you name it.

We also have our build and deploy processes fully automated. The steps are:

  1. Get the latest version from repository
  2. Execute the code-generation tool that we use
  3. Build all the projects that we have
  4. Create the database schema for all the databases
  5. Load the databases with some initial data
  6. Execute the unit tests
  7. Execute UI(Web) functional tests with another tool that we use
  8. Create the Deploy Package

The FULL execution (non-incremental, cleaning all compilation outputs, data, etc.) of this build process takes about 25 minutes on the HDD.

When we bought an SSD (OCZ Vertex 2), the same execution went down to less than 10 minutes.

I mean…. WOW!!! 😀

And not only that… since most of the time we are using a LOT of programs simultaneously (Visual Studio 2010, Eclipse, SQL Management Studio, web browsers, etc.), we noticed that on the HDD, even with a lot of free RAM, the computer would hang for a few seconds from time to time (while changing between files in Visual Studio, creating a new tab in Chrome, etc.). With the SSD, the response time of these applications dropped dramatically, and we all noticed a lot of improvement.

This build process is executed on average 5 times a day by every developer before they commit code, because we don’t want to rely only on the CI server to detect possible problems, but also want to encourage more proactive error checking. We believe this creates a better mood among the developers and reduces the time unstable code sits in the repository. With this in mind, here is some math:

WITHOUT SSD -> 5 * 25 = 125 minutes lost on build time per day.

WITH SSD -> 5 * 10 = 50 minutes lost on build time per day.

Again… WOW!!!

I just gained more than an extra hour a day (75 minutes) to actually do something instead of waiting for the build process to finish.

No more excuses… (at least not that often)

I’m really sure that my Project Manager got quite happy about that.

If I consider that a month has 20 working days on average, we could say that each developer gains around 20 hours a month. We have 20 developers, so that makes 400 hours a month as a whole.

We spent around 2000 € to buy the same SSD (OCZ Vertex 2) for everyone. Some would say that the speed of SSDs is not worth it because of their lower lifetime, reliability, etc.

One of our developers had his SSD completely burn out (not literally, it just went dead and did not wake up :D), and two others lost the MBR (Master Boot Record) while resuming their laptops from hibernation. The hibernation problem can also happen to HDDs, so we all just disabled it to prevent future loss and now only use Sleep mode. There are a bunch of forums talking about that, just Google it. We also disabled a lot of now-unused services like defrag, indexing, etc. (there are a lot of forums and articles about this too, because the unnecessary writes lower the lifetime of an SSD). On the other hand, the complete failure could not be explained and we had to send the SSD back. Thank god it was still under warranty. With this, we lost 3 days from those 3 developers in total (1 day each) due to software installation on the new/reinstalled drives. We do keep in mind that we have a super READ- and WRITE-intensive modus operandi, and we also always try to commit as frequently as possible to avoid data loss.

We have been using the SSDs for almost 3 months now. Again, some more math:

3 (months) * 400 (hours saved by the development team per month) = 1200

3 (days of software installation because of the failed SSDs) * 8 (work hours per day) = 24

1200 – 24 = 1176 hours saved by the development team in 3 months.

20 (hours per month, per developer, wasted on build time without an SSD) * 20 € (average developer price per hour) = 400 €
SSD unit price = 200 €

Even if we bought two SSDs per developer each month, it would still be cheaper for the employer!

Now you can imagine how happy our Project Manager is :D.

We are still not happy with the build time, and we are making some further optimizations to it.

GO GO GO SSDs!

EnvRide is now on Mercurial!!!

The EnvRide project hosted on CodePlex, previously using SVN as its source control system, is now using Mercurial! 😀

This change was made because the SVN support provided by CodePlex is only an adapter on top of Team Foundation Server.

There were some really nasty bugs when working with multiple developers, so we decided to switch to the Mercurial support, and also to get used to the distributed SCM approach.

For those wishing to learn more about Mercurial, see this tutorial.

Also, if you are a Windows + SVN user and are used to TortoiseSVN, check out TortoiseHg, a Windows shell extension and a series of applications for the Mercurial distributed revision control system.

New phase for the development team and the project itself! Wish us luck 😀