Day 5 of the Exoftware Agile Course at UCD Dublin

If you’re very observant, you might notice that this post is about a week late. Still, here are my notes from the slides on the final day of the Agile course.

Previous Posts from the training course are:


JUnit Introduction

  • JUnit: a framework for writing automated unit tests (you don’t need JUnit to write the tests, but it helps).

  • All JUnit tests extend TestCase.

  • Individual test methods (pre JUnit 4.0) use the public void testSomething() method signature.

  • Use the assertEquals, assertTrue, assertFalse, assertNull, assertNotNull, assertSame and fail methods for testing.

  • setUp() and tearDown() methods are called before and after each test method.

  • TestSuites (groups of TestCases) have largely been replaced by Ant and (Eclipse) IDE functionality.

  • Run tests via the built-in text / Swing test runners, or more likely via IDE / Ant integration.

  • Organise tests in the same folder as the code, or in parallel folders.

  • Name test classes TestMyClass or MyClassTest.

Mocks and Stubs

  • Problem: some classes (collaborating objects) can be tightly bound to system resources (e.g. a file or database) or to a hard-to-test API. Another example is the observer pattern. How do we unit test these?

  • Solution: use a fake object and pass it to the class under test; this allows testing of the class at the unit level.

  • Stub: a fake object that uses hard-coded data, often following an existing API. Many ready-made stubs are available (e.g. for the JDBC libraries).

    • Replace expensive-to-create objects.

    • Have to create all the objects that the production code interacts with.

    • Start in the middle and develop outwards.

    • One failure can ripple out / appear elsewhere.

  • Crash Test Dummy: a type of stub that deliberately fails in order to test exception handling – simulating a database crash or an I/O failure.

  • Self Shunt: the unit test itself implements the interface (doesn’t work for classes) and gets callbacks from the class under test.

  • Mock Objects: like stubs, except that they are intelligent enough to self-verify.

    • Mock secondary objects (normally ones that we build ourselves) instead of APIs (as stubs do).

    • Outside-in development style.

    • What if the mock implementation is incorrect?

  • Interaction vs. State: in state-based testing you check the tests by examining the state after the stimulus; in interaction-based testing you check that the right interactions were triggered by the stimulus – Martin Fowler, http://martinfowler.com/articles/mocksArentStubs.html

  • Mock objects tend to test the interaction between objects rather than being a unit test of the object itself.

  • Stub rather than mock external APIs.
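A minimal hand-rolled sketch of the stub / mock distinction, with no mocking library assumed (MailService, OrderProcessor and the other names are invented for illustration): the stub merely satisfies the dependency, while the mock records the interaction and can self-verify afterwards.

```java
import java.util.ArrayList;
import java.util.List;

// The collaborator the class under test depends on.
interface MailService {
    void send(String to, String body);
}

// Class under test: depends on the collaborator only via its interface.
class OrderProcessor {
    private final MailService mail;
    OrderProcessor(MailService mail) { this.mail = mail; }

    void confirm(String customer) {
        mail.send(customer, "Your order is confirmed");
    }
}

// Stub: canned behaviour, no verification of its own (state-based testing
// would inspect the OrderProcessor afterwards instead).
class StubMailService implements MailService {
    public void send(String to, String body) { /* do nothing */ }
}

// Mock: records interactions and self-verifies (interaction-based testing).
class MockMailService implements MailService {
    final List<String> recipients = new ArrayList<>();
    public void send(String to, String body) { recipients.add(to); }

    void verifySentTo(String expected) {
        if (!recipients.contains(expected))
            throw new AssertionError("no mail sent to " + expected);
    }
}

public class MockDemo {
    public static void main(String[] args) {
        MockMailService mock = new MockMailService();
        new OrderProcessor(mock).confirm("alice@example.com");
        mock.verifySentTo("alice@example.com"); // interaction check, not state check
        System.out.println("mock verified");
    }
}
```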

Smells and Refactoring

  • Code smells are bad or suspect design decisions in code.

  • Not always bad, but they should be looked at (often a trade-off will be involved).

  • Smells often mean code that is hard to understand and hard to change.

    • Examples: duplication / long methods / poor naming / tightly coupled classes / switch statements.

  • Refactoring: small, controlled changes to the codebase (so that it continues to compile and the tests keep passing) that improve the design. The behaviour remains exactly the same.

  • Separate refactoring from adding new functionality.

  • Easier with automated tools, and with unit tests to confirm that nothing has broken.

  • Refactoring != rewrite. The codebase is evolved, not thrown away.
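A small sketch of a behaviour-preserving refactoring (the Invoice class and its discount rule are invented for illustration): the original method is split with Extract Method, and because both versions are kept here side by side, a test can confirm the behaviour is unchanged.

```java
// Behaviour-preserving refactoring: Extract Method.
public class Invoice {
    private final double amount;

    public Invoice(double amount) { this.amount = amount; }

    // Before: one method mixing the total calculation with the discount rule.
    public double totalBefore() {
        double discount = 0;
        if (amount > 100) {
            discount = amount * 0.1;
        }
        return amount - discount;
    }

    // After: same behaviour, with the discount rule extracted into
    // an intention-revealing helper method.
    public double total() {
        return amount - discount();
    }

    private double discount() {
        return amount > 100 ? amount * 0.1 : 0;
    }

    public static void main(String[] args) {
        Invoice big = new Invoice(200);
        System.out.println(big.totalBefore() + " == " + big.total()); // 180.0 == 180.0
    }
}
```

In a real codebase the "before" version would be deleted once the tests pass; it is kept here only for comparison.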

User Stories

  • In some ways, equivalent to Use Cases from predictive methods (but with subtle differences).

  • Represent a chunk of functionality that makes sense to the customer.

  • Can be represented as a card / a conversation / paper-based documents (the more predictive approach).

  • In Agile (as opposed to predictive methods), shift the focus from writing to talking, i.e. get a true understanding of what the customer wants, even if they haven’t expressed it very well.

  • Lessen the importance of written requirements.

  • Support iterative development and participatory design.

  • Even a simple requirement has many possible permutations (which lessen the odds of getting it right).

  • Not just a generic user, but user roles – groups of users that do different things with the system, depending on their experience, the task at hand, etc.

  • User Stories should be INVEST:

    • Independent

    • Negotiable

    • Valuable

    • Estimatable

    • Small

    • Testable

  • Template (flashcard, as used by XPlanner).

  • Be careful of size:

    • Too big, and it cannot be used for estimating (break these down into smaller stories). This is easy for compound stories, but complex stories might need some research before they can be broken out.

    • Too small, and it is not worth the admin of estimating each one. Bundle them up into a higher-level task like UI improvements or bug fixes.

  • User stories should give us the acceptance tests.

What is Web 2.0?

I’ve often been asked the question ‘What is Web 2.0?’. Normally it’s followed quickly by the question ‘how do I make money out of it?’. Recently I’ve been thinking that Web 1.5 might be a better term (as it is an upgrade to the web, with a mix of shiny new and old-but-reliable techniques).


While there are many sites jumping on the bandwagon and claiming the Web 2.0 title, Tim Ziegler, writing on Webmonkey, gives a very good summary of what most Web 2.0 sites have in common. As always, you’ll be able to find true Web 2.0 companies that break these ‘rules’, but it’s as good a place to start as any.

In Summary, Web 2.0 Sites / Companies / Products tend to:

  • Build on the notion of ‘the long tail’, where niche demand meets niche supply – a market made cost-effective and profitable by lower transaction / search costs on the Web.
  • Web as a platform. It doesn’t matter where you are, or what computer you use. As long as you have a web browser you can use these products.
  • Ajax, a technique that combines the power of traditional desktop applications (like Word and Excel) with the ‘use-anywhere-ness’ of web pages.
  • Smart content management. Create and publish a web page as easily as a Word document. Forget needing to know FTP, HTML, CSS and other low-level tools of a previous generation.
  • Dashboard views. Because web pages are published in machine- as well as human-readable formats, it is easy to create summary, dashboard views.
  • Give it away to get more back. It is alleged that in earlier days Microsoft was more willing to tolerate piracy of its office suite in order to get ‘critical mass’ – missing out on revenue initially, but growing the cake substantially in the process by becoming the de facto standard. It may have worked for them, but ‘give your product away to get a core of paying users’ won’t impress too many VCs.
  • Human filters, or trust your users. Also known as ‘many hands make light work’.
  • Iterations, or many small releases (easy because the software runs in only one place), are better than one big bang.
  • Simple is good. Forget feature overload – think iPod.

So that answers the ‘What is Web 2.0’ Question. Up to you to find out how to make money out of it.

More Web 2.0 Posts here.

Day 4 – Summary of the Agile Course so far (Part 1)

Below are my notes from the Agile course so far – covering an introduction to Agile projects, why it makes business sense, the use of FIT and other acceptance-testing tools, and what Agile means for customers, managers and teams.

These notes are a good complement to the Agile presentation I gave.

Introduction to Agile

  • Requirements change on a project – it’s going to happen. Clients (who are paying for the code) have the right to change their minds; accept it and get on with it.

  • Agile Family Includes

    • XP (Extreme Programming)

    • Scrum

    • Adaptive Software Development

    • Lean Software Development

    • Feature Driven Development

    • Crystal

    • DSDM

  • Values wrap Principles wrap Practices

  • Values (from the Agile Manifesto)

    • Individuals and interactions over processes and tools.

    • Working software over comprehensive documentation.

    • Customer collaboration over contract negotiation.

    • Responding to change over following a plan.

  • Principles (Customer Focused)

    • Deliver early, deliver often.

    • Deliver valuable software.

    • Welcome change.

    • One big team (business people and developers).

  • Principles (for Management)

    • Motivated individuals give better results.

    • Face-to-face is the best way to communicate (though accept the need to
      communicate across locations and time zones as a good second best).

    • Remember: working software that meets business needs is the primary
      measure of progress.

    • Promote sustainable development (or you’re just going into technical debt).

  • Principles (for Teams)

    • Use technical excellence and good design to promote agility, but

    • simplicity is essential.

    • The best architectures, requirements, and designs emerge from
      self-organizing teams (rather than being imposed from above).

    • Continually improve the process.

  • Practices

    • Continual Improvement

    • Technical Excellence

    • Project Management

    • Business Practices

    • Community and Collaboration

  • Why Agile

    • Waterfall projects: we’re most likely to find bugs in the testing phase
      at the end, when (a) we’re most likely to be under time pressure and (b)
      they’re most expensive to fix.

    • Waterfall projects: hard to know how far through we are, as each step is
      different from the last (design has gone well, but will coding go better
      or worse?)

    • Agile: testing is continuous, so we’re likely to get early feedback and
      resolution of bugs. Testing = unit tests and acceptance tests.

    • Agile: progress is steady and measurable, as we carry out all tasks as
      part of a regular cycle.

    • Productivity = doing less but producing more value, e.g. the customer
      can prioritize the highest-value tasks and we deliver these first.

    • Standish Group stats show small projects are more successful.

    • Deploying early and often gets usable software into the hands of users
      sooner, where it can start to pay back its investment earlier.

    • Success factors for Agile projects: enthusiastic development team /
      committed customer / knowledgeable on-site QA resource / didn’t
      cherry-pick practices / open workspace / regular retrospectives.

  • Still hard work to do:

    • Good people give good software

    • Can be faked, so people really need to buy in

    • Practices are learnable / a lot are common sense

    • Recognize reality

Informative Workspace

  • A person walking into a ‘normal’ coding office has no idea of what is going on, nor what status the project is at. Most of the information is in people’s heads.

  • An informative workspace seeks to communicate this information (via simple methods) in the office, e.g. notice boards, a coder-of-the-week hat, printed graphs of velocity, iterations etc.

  • Kent Beck: "An interested observer should be able to walk into the team space and get a general idea of how the project is going in 15 seconds."

  • Similar to the Kanban idea – while going for Just in Time / continual improvement, visual representations are displayed to demonstrate progress.

  • Information Radiators: anything that gives out information (example given: a monitor showing the latest build progress) – jokey examples include lava lamps and notes in the toilet.

  • Toyota (Toyota Production System – they recognise their edge is in how they build their cars, and not just the cars that they build).

    • Human face to automation.

    • Board giving current
      production status

    • Audible Cues

    • Ability for anybody to stop the line.

Testing and Design by Contract

  • Started by Bertrand Meyer in the Eiffel language.

  • Frameworks are available to add these (fully) to the Java language – over and above the functionality available in Java 1.4 (the assert keyword).

  • We already have a contract that is enforced by the compiler in Java (e.g. method signatures, return types).

  • Design by contract adds to this at a finer-grained level, e.g. a method should return not only an Integer, but an Integer with a value of 0 to 100.

  • Like every contract, both sides (caller and callee) have responsibilities.

  • Pre-conditions: conditions that must be true when the method is called.

  • Post-conditions: conditions that the method guarantees when it returns.

  • Class invariants state what will be true at a class level (once all pre-conditions at the method / constructor level are satisfied).

  • Both pre- and post-conditions provide additional specification (i.e. it is clearer to people using the code what is required and what it does). As such they provide a clearer design and additional documentation.

  • Can make code more succinct by replacing ‘boilerplate’ code with a standard check (e.g. a pre-condition that a parameter must not be null).

  • Java example:

    • Pre-condition: /** @require !empty() */ – as javadoc on the method (multiple checks allowed).

    • Post-condition: /** @ensure !full() */

    • Class-level invariant: /** @invariant 0 <= count() */

  • Exceptions

    • A pre-condition violation should give a RuntimeException – provide a method to test before the call (e.g. isEmpty() before doing a remove()).

    • Post-conditions: the supplier should make every effort to fulfil these, so if it can’t, it is normally due to an Exception (the caller should be prepared for this failure).

  • JUnit

    • Code in JUnit verifies the contract of the class (and provides documented examples of code use).

    • assert methods = post-condition testing.

    • Test invariants with checks at the class and not just the method level (e.g. check the overall state of the class, not just the return value of the method under test).

    • Pre-conditions may be tested by violating them in the setup (the code before the call to the method / class under test) and ensuring an exception is thrown.

  • Think about the test as a contract.
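The ideas above can be sketched in plain Java without a DbC framework, using explicit runtime exceptions for violated conditions (the notes suggest RuntimeException for pre-condition violations). The bounded counter and its limit are invented for illustration.

```java
// Design by contract, hand-rolled: pre-conditions and the class
// invariant are checked explicitly and throw runtime exceptions.
public class BoundedCounter {
    static final int MAX = 100;
    private int count = 0;          // invariant: 0 <= count <= MAX

    // Query methods so callers can test pre-conditions before calling.
    public boolean isFull()  { return count == MAX; }
    public boolean isEmpty() { return count == 0; }

    public void increment() {
        if (isFull())               // pre-condition: !isFull()
            throw new IllegalStateException("pre-condition violated: counter full");
        count++;
        checkInvariant();           // invariant must still hold on return
    }

    public void decrement() {
        if (isEmpty())              // pre-condition: !isEmpty()
            throw new IllegalStateException("pre-condition violated: counter empty");
        count--;
        checkInvariant();
    }

    public int count() { return count; }

    private void checkInvariant() { // class invariant: 0 <= count() <= MAX
        if (count < 0 || count > MAX)
            throw new IllegalStateException("invariant violated: " + count);
    }

    public static void main(String[] args) {
        BoundedCounter c = new BoundedCounter();
        c.increment();
        c.increment();
        System.out.println("count = " + c.count()); // count = 2
    }
}
```

A unit test for this class would violate the pre-condition in its setup (e.g. call decrement() on a fresh counter) and assert that the exception is thrown, exactly as the JUnit bullet above describes.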

Acceptance Testing

  • Acceptance tests – defined (or agreed) by the customer; passing them states that the functionality is complete.

  • Differ from unit tests in that they are system-wide rather than at the class / unit level.

    • Sometimes people use high-level / coarse-grained JUnit tests.

  • Unlike unit tests, they don’t need to pass 100% all the time (but once passed, they should always pass).

  • Automation is nice (for developer and customer).

    • Own note: not everything in acceptance testing is automatable, e.g. UI testing. In this case, automating the acceptance testing of the calculations (the 80%) means that the UI testing is a lot easier.

  • Sample frameworks for automated acceptance testing:

    • FIT – see notes below.

    • Exactor – see notes below.

    • FitNesse – FIT but wiki-based (easier to run multiple tests, easier to communicate these to a team).

    • WinFIT runner.

  • Input to all these tools is close to a spec (close in that a typical user, after one walkthrough, should be able to understand what the documents mean).

  • Tests can be written before the application is built (again, the notion of the tests forming a spec). A lot of the commercial tools, especially those that record against a ‘live’ application, do not allow this.

Automated Acceptance Testing with FIT

  • Allows customers to review (if not write) the acceptance tests.

  • Previously these tests would have been done manually, or via large JUnit tests (i.e. using JUnit not for unit testing, but as a way of describing the entire app from the user’s point of view).

  • Data (unlike in Exactor) is specified as an HTML table. All non-table content is ignored (so it is safe to create test documents using MS Word).

  • A fixture connects this HTML data to run the test against ‘production’ code. Fixtures are written by programmers (as part of proving the code meets the spec). There are three types of fixture:

    • Column

      • One class for each table.

      • FIT works through the table on a line-by-line basis – good for repeating the same action over and over again.

      • 1st line of the table: the class name to be tested.

      • 2nd line of the table: the class variables to be set or methodName()s to be called.

      • 3rd to Xth lines of the table: values to set, or expected values to be returned.

      • Good for testing the domain (business) logic parts of an application.

    • Action

      • Multiple classes for each table.

      • start – the class to be tested.

      • enter – a value to be passed into a method (no return value).

      • press – call a method (no input or output).

      • check – check the value returned by a method (no input).

      • Good for simulating interaction with the user interface (or, more accurately, the controller that sits just behind the user interface).

    • Row

      • Like Column, but interprets all rows at once (rather than one at a time).

      • Extend RowFixture and provide the query() and getTargetClass() methods – these allow FIT to see what the data structure is.

      • Good for testing code in the service / persistence / db layer.
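The column fixture mechanism above can be sketched in plain Java. This is NOT the real fit API, just a toy illustration of the reflection it relies on: plain column names become public fields to set, and names ending in "()" become methods whose return value is checked. The Division example follows FIT’s own canonical sample.

```java
import java.lang.reflect.Field;
import java.lang.reflect.Method;

// Toy sketch of what a column fixture does: map table columns onto a
// fixture object via reflection, one fresh fixture per data row.
public class MiniColumnFixture {
    // Fixture under test: fields are inputs, quotient() is the checked output.
    public static class Division {
        public double numerator;
        public double denominator;
        public double quotient() { return numerator / denominator; }
    }

    // One table: a header row of column names, then data rows.
    static final String[] HEADER = { "numerator", "denominator", "quotient()" };
    static final double[][] ROWS = { { 10, 2, 5 }, { 12.6, 3, 4.2 } };

    public static int run() throws Exception {
        int right = 0;
        for (double[] row : ROWS) {
            Division fixture = new Division();   // fresh fixture per row
            for (int col = 0; col < HEADER.length; col++) {
                String name = HEADER[col];
                if (name.endsWith("()")) {       // expected value: call the method
                    Method m = Division.class.getMethod(name.substring(0, name.length() - 2));
                    double actual = (Double) m.invoke(fixture);
                    if (Math.abs(actual - row[col]) < 1e-9) right++;
                } else {                         // input value: set the field
                    Field f = Division.class.getField(name);
                    f.setDouble(fixture, row[col]);
                }
            }
        }
        return right;   // number of checked cells that matched
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run() + " cells right"); // 2 cells right
    }
}
```

In real FIT the table comes from an HTML document and the fixture extends fit.ColumnFixture; the reflection idea is the same.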

Automated Acceptance Testing with Exactor

  • Exactor (http://exactor.sourceforge.net)

  • Allows customers to review (if not write) the acceptance tests.

  • Previously these tests would have been done manually, or via large JUnit tests (i.e. using JUnit not for unit testing, but as a way of describing the entire app from the user’s point of view).

  • Comprises (like FIT):

    • Script – a text file that you could show the customer.

    • Commands – what the script calls; these extend Exactor classes and in turn call the code being tested.

    • Command extends the JUnit assertion classes, so the usual capabilities are available.

    • Command provides access to a test-wide map to allow the storing / exchange of values.

    • Composites allow scripts to be reused.

    • Scripts allow placeholder parameters.

Day 2 of the Agile Course at UCD

Day 2 of the Agile course (provided by Exoftware) at UCD. Most of yesterday (day 1, which I missed) was a good but standard introduction, including:

  • Differences between RUP (Rational Unified Process), Extreme Programming (XP) and Agile approaches.
  • The XP / Agile Planning Game.
  • Problems with the existing software development process.

Today (Day 2) is a lot more ‘hands on’, in that it’s straight into the computer labs to work through the example that we’ll be using for the rest of the week. Two of the more interesting items that we’ve covered today (apart from working through the ‘build a system to play blackjack’ example) are:

  • Comparing Test-Driven Development to Design by Contract. In this comparison, the code to set up your unit tests is equivalent to your pre-conditions, and the assertions within the JUnit tests are the post-conditions.
  • Separation of concerns on get / set: each should do one thing, and one thing only.

The worked examples from today (Mock Objects and Video Store) are also very well thought through and get the point across.
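The TDD / Design by Contract comparison above can be sketched in plain Java (no JUnit dependency; the Account class is invented for illustration): the test’s setup code establishes the pre-condition, and its final assertion checks the post-condition.

```java
// Sketch of the TDD / Design-by-Contract parallel: setup = pre-condition,
// assertion = post-condition.
public class TddAsContract {
    static class Account {
        private int balance = 0;
        void deposit(int amount)  { balance += amount; }
        void withdraw(int amount) { balance -= amount; }
        int balance()             { return balance; }
    }

    public static void testWithdrawReducesBalance() {
        // Setup = pre-condition: the account holds enough funds.
        Account account = new Account();
        account.deposit(100);

        account.withdraw(30);

        // Assertion = post-condition: the balance reflects the withdrawal.
        if (account.balance() != 70)
            throw new AssertionError("expected 70, got " + account.balance());
    }

    public static void main(String[] args) {
        testWithdrawReducesBalance();
        System.out.println("test passed");
    }
}
```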