Tuesday, January 29, 2013

Winter 2013 Meeting of OK SPIN


https://plus.google.com/events/cjgstjbf50eive03l3fjt8gbsjs?authkey=CJ7x_7zjgbvOjAE

We've finally settled on a name, OK SPIN! The Oklahoma Software Professionals Interaction Network. Come join other professionals and discuss the challenges and solutions for producing software.

We hope you can join us for our Winter 2013 meeting.

TOPIC: Tools
If you would like to prepare a 5-10 minute demo of a favorite tool, bring it with you and share with the group.

LOCATION: We are on the OU campus at 3 Partners Place on the main floor. (see the map for details)

TIME: Monday, Feb. 11, 6pm-8pm. 

AGENDA: 6pm-7pm Networking, 7pm-8pm Presentations.

Refreshments will be served; please feel free to bring a meal to eat.

SPONSORS: If you would like to sponsor these meetings, please contact Robert at robert@watkins.net

QUESTIONS: If you have questions, contact Robert at robert@watkins.net

Follow the OK SPIN Community on Google+ and share your own ideas!



Google+ Community page for OK SPIN
https://plus.google.com/communities/108846666538991188637

Thursday, January 24, 2013

I finished Cryptography I!

I started this course last year and it ended this week. It was a fun course and I enjoyed it very much. Cryptography II starts in a couple months. Yay!

Wednesday, January 23, 2013

Hold off taking "Software Testing" from Udacity



I was hoping to be able to recommend this course to others, but at this time I cannot. So much basic information is lacking, and the exercises are vague and not grounded in the material being presented.

Here are some specific examples.

Equivalent tests. The lecture essentially said, 'equivalent tests are tests that test the same thing.' Something more along the lines of http://en.wikipedia.org/wiki/Equivalence_partitioning would have been appropriate.
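
To show what I was hoping for, here's a minimal sketch of equivalence partitioning (my own hypothetical example, not from the course): pick one representative test per input class, plus the boundaries, instead of many tests that all exercise the same behavior.

    # Hypothetical function: classify an age into a fare category.
    def fare_category(age):
        if age < 0:
            raise ValueError("age cannot be negative")
        if age < 13:
            return "child"
        if age < 65:
            return "adult"
        return "senior"

    # One representative per equivalence class, plus the boundaries.
    assert fare_category(7) == "child"     # class: 0-12
    assert fare_category(30) == "adult"    # class: 13-64
    assert fare_category(80) == "senior"   # class: 65 and up
    assert fare_category(12) == "child"    # boundary 12/13
    assert fare_category(13) == "adult"
    assert fare_category(64) == "adult"    # boundary 64/65
    assert fare_category(65) == "senior"
    try:
        fare_category(-1)                  # invalid class
    except ValueError:
        pass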

Specifications. The lecture basically said, 'you won't often get specs, so you will have to build them'. That's a correct statement, but there is quite a bit to do to actually perform this successfully. Talking to people (other than the developer) is conspicuously absent from the discussion. There is often a non-developer who has an expectation of how something should work. A discussion of 'test oracles' would have been appropriate (see http://www.softwarequalitymethods.com/Papers/OracleTax.pdf for an excellent read).
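
A test oracle can be as simple as an independent way of computing the expected answer. Here's a rough sketch (my own hypothetical example, not from the course or the paper) that uses a slow but obviously correct reference implementation as the oracle for a faster function under test.

    import random

    # Function under test: stands in for a "fast" implementation to verify.
    def fast_sort(items):
        return sorted(items)

    # Oracle: an independent, obviously correct (if slow) way to get the answer.
    def oracle_sort(items):
        result = list(items)
        for i in range(len(result)):
            for j in range(i + 1, len(result)):
                if result[j] < result[i]:
                    result[i], result[j] = result[j], result[i]
        return result

    # Compare the two on lots of random inputs; any mismatch is a failure.
    for _ in range(1000):
        data = [random.randint(-100, 100) for _ in range(random.randint(0, 20))]
        assert fast_sort(data) == oracle_sort(data), data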

GUI testing. The lecture basically said, 'try to randomly fire GUI events'. Where is the discussion of modeling use cases as the start of test cases? Where is an attempt to understand the input/output of individual GUI elements?

It's not so much that the content is wrong, but that there is no acknowledgement of the wide variety of approaches available for testing. I get the feeling that the authors of this course have spent the majority of their careers in code and not in a manual testing role.

Anyone else have experience with this class?

Thursday, January 17, 2013

Answering Loaded Questions

As a human, I am slow to evolve my behavior. For example, I was recently asked, "What is slowing the pace of development for the new regressions so much?"

In the past, I would have had a snarky response, or just gotten upset. I still got upset, but I didn't speak up at first. I gave it some thought.

Here's what I came up with. I found a 1978 article, "Indirect Responses to Loaded Questions" (http://www.aclweb.org/anthology-new/T/T78/T78-1029.pdf), that describes how a natural language processor could handle such situations.

Here's the gist:

  • Identify the assumptions in the question
  • Determine which are valid, invalid or unknown
  • Respond with at least one of the following types (called 'Corrective Indirect Responses')
    • Answer a more general question
    • Offer the answer to a related question
    • Correct the assumption presumed to be true and indicate that it is either false or unknown.
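
Just to make those steps concrete for myself, here's a toy sketch (my own code, not from the paper) of tagging each assumption in a loaded question with its validity and collecting the corrections a response should carry.

    # Toy sketch of a 'Corrective Indirect Response': tag each assumption with
    # its validity, then correct anything that is not actually valid.
    def corrections_for(assumptions):
        # assumptions: list of (statement, validity), where validity is
        # 'valid', 'invalid', or 'unknown'
        return ["Note: '%s' is %s." % (statement, validity)
                for statement, validity in assumptions
                if validity != "valid"]

    # Hypothetical assumptions pulled out of a loaded question.
    print(corrections_for([
        ("There is an agreed-upon expected pace", "invalid"),
        ("Progress is slowing", "unknown"),
    ]))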
So, in my case, the assumptions and their validity are:
  • I would know about the pace of developing new regression tests. (this is true)
  • The pace of developing the regression tests is slowing. (this is only true in a very narrow view of the situation)
  • There is an expected pace for the work. (this is not true)
I'm still thinking about my response. Here is a sample of what I've come up with.
  • "As viewed through the project dashboard, the pace of the project has been pretty consistent, even if a bit slower than desired." 
  • "Before I was asked to coordinate this work, the project had no measurable progress for over a year. It now has measurable progress."
  • "When we discussed applying normal project planning methods to this work, I was told that it was too much effort. If we had done that, we could answer this question clearly. We have not asked for any commitments of time for all the team members involved. We have not asked for estimates of the effort to complete the work. Both of these would help manage the progress of this work."
What would you do?


UPDATE
I thought about this some more. Here are some additional responses, focused on providing solutions, rather than defending or explaining the current state.

  • "We can certainly discuss the pace of work. That will require agreeing to specific commitments with those doing the work."
  • "Let's discuss in detail specific estimates for the remaining work and available time with those doing the work so that we can get a better understanding of what the end date could be."


Tuesday, January 08, 2013

Udacity

Have you or anyone you know taken the software testing course from Udacity?

http://www.udacity.com/overview/Course/cs258/CourseRev/1

I think I'll give it a spin once my cryptography course is done.

Monday, January 07, 2013

Janitor Monkey

When I first heard about "Chaos Monkey" from Netflix, I was hooked. Its only job is to randomly kill processes to ensure that error handling and recovery works well.
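
For anyone curious what that looks like in miniature, here's a toy sketch (my own code, not Netflix's, and Unix-only) of the idea: spawn a few worker processes, randomly kill one every few seconds, and restart it the way a real supervisor should.

    import os
    import random
    import signal
    import time
    from multiprocessing import Process

    def worker(name):
        while True:
            time.sleep(1)  # stand-in for real work

    def start_worker(name):
        p = Process(target=worker, args=(name,), daemon=True)
        p.start()
        return p

    if __name__ == "__main__":
        workers = {name: start_worker(name) for name in ["a", "b", "c"]}
        for _ in range(3):
            time.sleep(5)
            victim = random.choice(list(workers))
            print("killing worker", victim)
            os.kill(workers[victim].pid, signal.SIGKILL)
            time.sleep(1)
            if not workers[victim].is_alive():
                workers[victim] = start_worker(victim)  # naive recovery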

When I heard from Slashdot that "Janitor Monkey" was open-sourced, I again jumped for joy.

From the article:

"Janitor Monkey is part of the so-called Simian Army of at least eight internal management tools, including Latency Monkey, which introduces artificial delays into the system, and Chaos Monkey. Some (but not all) of these tools have been open-sourced."
slashdot (http://s.tt/1y3BK)

How many systems that you work on are able to stand up to this kind of self-administered abuse? :)