Tuesday, October 16, 2012

Did you ever wonder...?

Did you ever wonder what happens when a program crashes and Windows asks if you would like to report the problem?

Here is a report from a few years ago that talks about that process. Chapter 6 is an interesting look at how effective their 'automatic bucketing' of bugs is.

http://research.microsoft.com/pubs/81176/sosp153-glerum-web.pdf
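The core idea behind that automatic bucketing is simple: derive a signature from a few stable attributes of each crash and group reports that share it. Here is a minimal sketch in Python; the field names are illustrative only, not the actual Windows Error Reporting schema:

```python
from collections import defaultdict

def bucket_key(report):
    # Hypothetical signature: faulting app, faulting module, and crash
    # offset. Real WER uses a richer set of attributes, but the grouping
    # principle is the same.
    return (report["app"], report["module"], report["offset"])

def bucket_reports(reports):
    # Group crash reports that share a signature into one bucket,
    # so thousands of reports collapse into a handful of distinct bugs.
    buckets = defaultdict(list)
    for r in reports:
        buckets[bucket_key(r)].append(r)
    return buckets

reports = [
    {"app": "notepad.exe", "module": "ntdll.dll", "offset": "0x1a2b"},
    {"app": "notepad.exe", "module": "ntdll.dll", "offset": "0x1a2b"},
    {"app": "calc.exe", "module": "calc.exe", "offset": "0x0042"},
]

buckets = bucket_reports(reports)
# Two distinct buckets; the first holds both matching notepad crashes.
```

The paper's Chapter 6 discusses how well this kind of heuristic grouping works in practice, including the cases where distinct bugs land in one bucket or one bug scatters across many.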

Enjoy!

Saturday, October 06, 2012

Essential Software Test Design


I did a review of this book previously and found it to be the best book on test design I've ever read.

I found out today that you can get a PDF version of it for free!

What are you waiting for? Get it now!

Friday, August 17, 2012

Stop Testing! (but know when)

On rare occasions, a feature that I'm scheduled to test is very different from what I expected. Generally, this happens for one of the following reasons:

  1. The feature was not documented fully because 'the team doesn't do that sort of thing'

    OR
  2. The feature was not documented because 'it is so minor, it's not worth documenting'

    OR
  3. The feature was implemented by someone who needs (ahem) oversight.

Whatever the cause, I have found myself in each of these situations and faced the same challenge. What to do?  
  1. First, Stop Running Your Planned Tests. Really, you aren't going to get anywhere.
  2. Start exploring what was implemented. You may find that it's your misunderstanding and not the developer's.
  3. Keep good notes.
  4. Raise a Red Flag. Either you need to re-write your tests or the software needs to be re-written. Either way, this is going to affect the schedule.
  5. Re-test once this is cleared up.
All too often, I have found that I misunderstood the documentation, so taking the time to dig in will save lots of grief from false bug reports. You will also learn a bit about how you make mistakes and how to avoid them in the future.

Good Luck!


Tuesday, June 19, 2012

Listen to the Software Test Podcast!

I was fortunate to be able to talk to Michael and Emily for their podcast. I had a good time and I hope you enjoy listening!

http://blog.softwaretestpodcast.com/2012/06/19/episode-30.aspx

Friday, June 08, 2012

New Software Professionals Group in OKC!




Would you like to meet other Software Professionals?
(even if you aren't a developer)

Would you like to get tips on how to manage projects?
(even if you have no formal project manager)

Would you like to know how much testing is 'enough'?
(even if you have no dedicated QA staff)

Would you like to hear how to manage requirements?
 (even if you don't have any Business Analysts) 

If you are a BA, QA, PM, Developer or anyone that produces software, this group is for you.


Our first meeting will be Thursday, July 26th at 6pm.


Please forward this to anyone you think would be interested.

Thursday, May 24, 2012

Multiple Openings Near Tulsa

Here are several openings available. If you know of anyone interested, please forward this.


-----------------------------------------------------------


Role Requested: Application Testing Analyst
Location: Bartlesville, Oklahoma
Number of Resources Requested: 3
Role Begin Date: June 1, 2012
Role End Date: December 31, 2012
*Please note end date is estimated

Role Description:
Responsibilities

The Application Testing Analyst may be responsible for:
- Converting and developing automated application testing script scenarios using HP's Quality Center (QC), HP's Quick Test Professional (QTP) and SAP's Test Acceleration and Optimization (TAO) software tools.
- Reviewing requirements, test scenarios and working with business functional analysts to develop and validate automated testing scripts.
- Interfacing with Business Unit users and testers worldwide.
- Providing support and assisting in troubleshooting for automated test scripts.
- Adherence to corporate guidelines, standards, and best practices in the use and deployment of cross-application solutions.

Qualifications
- Bachelor's Degree in MIS, computer science, or information technology; or 5 years of direct experience in the information technology field.
- 2 or more years of direct experience with software/application support and/or development.
- 1+ years of experience with HP Quality Center
- 2+ years of experience developing automated tests with SAP TAO
- 2+ years of experience developing automated tests with HP Quick Test Pro
- 1+ year of SAP application testing experience
- Demonstrated analytical and problem solving ability.
- Demonstrated ability to multi-task in a team environment.

If you are interested in this position, please email an updated copy of your resume to JCamarillo@Escendent.com along with your salary requirements and a brief description of your most recent experience as it pertains to the position above.

Joseph Camarillo
IT Recruiter
Escendent, LLC
(Office) 312-445-4500
(Cell) 832-256-0719
www.escendent.com

Monday, May 14, 2012

Why using requirements as the only source for testing is a bad idea...

James Bach pointed this out in a recent interview on the Software Test Podcast. The video is called 'Shouting in the Data Center' and it shows the effect that shouting has on disk performance. (Really!)

http://www.youtube.com/watch?v=tDacjrSCeq4

Monday, May 07, 2012

Influence


My boss handed me a book and told me, "Here is a gift for you. Let me know when you are done so I can borrow it." He has recommended other books before (both work-related and otherwise) and hasn't steered me wrong.

In short, this is an excellent book for anyone who wants to avoid the traps of persuasion and understand how others can be effectively persuaded. The author draws on his training as a PhD in psychology, as well as the experience he gained preparing for this topic, to put together a wonderful account of the science and practice of influence.

In the book, you'll learn:
  • Why social obligations can seem so burdensome (and why we participate in this activity anyway)
  • How we find ourselves agreeing to do something we have told ourselves we wouldn't do.
  • Why 'the crowd' is so powerful in influencing what we do (and do not) do.
  • How many times we are influenced and not even aware of the influence.
It's an excellent read full of fun stories and deep insight. I hope you'll take the time to read the book as it will help in nearly every aspect of your life. :)

Sunday, April 29, 2012

The Beginner's Mind

"In the beginner's mind there are many possibilities, in the expert's mind there are few." - Shunryu Suzuki

For a long time, I refused to track test requirements at any level of detail. The rationale was that it took too long and had limited value. I was able to convince many people that I was right and that this wasn't needed. I 'knew' that this wasn't useful, so I didn't do it.

What really happened was that I had tried a couple of times to track requirements at a very (very, very) granular level and gave up when the maintenance of those requirements became too much. I made the mistake of believing that my way was the only way and that since my way was not workable, no way was workable.

I've spent too much of my career 'knowing' what to do. This is not to say that I've come across as a know-it-all or been uncooperative when other (and better) ideas come along. I mean that I've not taken the opportunity to seek other opinions, especially from those with a 'beginner's mind'.

The 'beginner's mind' is not found only in beginners, but in anyone who gives the many possibilities full consideration, even when experience says otherwise. We use our experience to guide us toward better decisions with less work. However, these shortcuts can get in our way, especially when the basis for them no longer exists.

Since these early days, I've been able to open my eyes to the possibilities and come up with a method of tracking test requirements that is actually maintainable. (go figure)

Are there things you 'know' to be true about testing software?
Have you examined those 'truths' to ensure their basis is still valid?

Examine your beliefs, see what you find.

Monday, April 09, 2012

http://www.summerqamp.org/

If you know of any company that is looking to hire summer QA interns, this program looks like a good way to get support for doing so and to connect with people interested in entering the field.

If you are or know of someone interested in QA, check out the link!

http://summerqamp.org/


Tuesday, April 03, 2012

Auntie Cueway

From time to time, people send me things to further my education, to bring a laugh to my day, or to generally improve my moral standing. I've started anonymously receiving letters that seem to be from an 'Auntie Cueway' to someone she is mentoring. She writes as if she is mentoring a nephew, but I'm not entirely sure whether that is accurate.

Rather than give you my impressions of these letters, I will reproduce them here and let you decide.

P.S. Apologies to C.S. Lewis. Any resemblance to 'The Screwtape Letters' may be intentional...

-------------------------------------------
TO: gunther@corp.com
FROM: auntie@corp.com
SUBJECT: Congratulations!

My Dearest Gunther,

Congratulations on your new role as team lead! It took quite a bit of patience and effort on my part to get the director to believe it was his idea to put you in charge of the team after your previous manager's 'accident'.  It's now your turn to put in the work we've been planning. Even you should be able to understand how important the kickbacks I get for our current test tool licenses are, so be on your toes.

Signed,
Your loving Auntie Cueway



-------------------------------------------
TO: gunther@corp.com
FROM: auntie@corp.com
SUBJECT: Metrics

Gunther,

I was looking over the proposed Metrics Standards document you drafted and am very disappointed. It's all very well to require five pages of metrics each week for each team, but you must be more careful in picking your metrics.

Since I know you will be unable to figure this out on your own, let me be blunt. Take out the Defect Detection Percentage and the whole section on test requirement coverage. These are far too helpful for the director and the other metrics are seemingly helpful enough that he won't complain about the content. If he does, you need to be sure to speak to him as if he were a fool and suggest that if he doesn't understand these metrics, maybe he's not suitable for his position. That may be a bit risky, but if you use the right nuance, then he will internalize the comment as if he had come to that conclusion himself.

Affectionately,
Your devoted Auntie Cueway

-------------------------------------------

TO: gunther@corp.com
FROM: auntie@corp.com
SUBJECT: Automation

My Darling Gunther,

From what I hear from the developers, your test automation strategy is coming along nicely. The tests are just unstable enough to often work, but not so unstable as to warrant a re-write. They still require lots of manual intervention to start and then to review results. My favorite part is that you were able to convince the director that the work needed to automatically put the results into the test management system is so onerous that it's not really worth the effort. If my calculations are correct, the automation is twice as time-consuming as running the tests manually would be. This is quite an achievement and will ensure that our test tool licenses continue to be funded (along with my kickbacks).

But I do have a word of warning: the director is not so simple-minded as to be fooled for long. You must quickly show improvements in the automation so that he doesn't bring in a consultant or hire someone who can see through your ruse.

Always your servant,
Auntie Cueway


-------------------------------------------
TO: gunther@corp.com
FROM: auntie@corp.com
SUBJECT: Not your best work

My little Gunther,

I'm not sure why you have disregarded my previous warnings about the director. I'm hearing rumors that he's beginning to reconsider your appointment. You realize that if you fail me, these lucrative kickbacks will stop and you will be out of a job!

I have always provided you with a good position and have led the way for you to rise within Corp's ranks while being able to keep my own involvement a secret. This is how you repay me? Your father tried to warn me when I wanted to hire you. He said you were not up to the task, but I stood by you. My only hope now is that you can show enough improvements in your team to head off a disaster.

It's clear that you are unable to think of these things yourself, so here is what you need to do. You need to push your team to write and run as many tests as they can, regardless of their value. Then, you need to write as many bug reports as possible. Take that notebook you've been keeping on system quirks and write up a half dozen bug reports for each of these. Now pay attention, this next part is critical. For each bug, you must follow your bug reporting guidelines perfectly. There should be no question that each bug report is well-written and represents a true quirk. You should give each version of these bugs a wide range of priorities so as not to draw attention to what you are actually doing. Hopefully, with this explosion of new tests and defect reports, the director will ease his suspicions.

Your Loving Auntie Cueway



-------------------------------------------

NOTE: These are only the first few letters and I've not been able to read further. As the remaining letters are organized, I will post them for you to read.





Thursday, March 29, 2012

QA Analyst/Tester Position in OKC

This came across my email box. Even if you are not interested, please forward this on.


Greetings!

Our prominent Orange County client is aggressively seeking a QA Analyst/Tester with Automation Experience for a 6+ month Contract in Oklahoma City, OK! Our client is a leading independent broker and agency writer of automobile insurance in California and has been one of the fastest growing automobile insurers in the nation. It is ranked as the third largest private passenger automobile insurer in California, with total assets over $3 billion. Please note that our client cannot sponsor H-1 visas and will ONLY consider candidates with excellent communication skills.

Below is the description for the position. Please review it and if you are interested, please contact Susan Ashman (susan.ashman@zebra-net.com) at 949/910-9800 or 949/900-6110, or Michelle Oelhafen (michelle.oelhafen@zebra-net.com) at 949/900-6110. We will be happy to tell you who our client is and begin the interview process for you. Thank you for your consideration!

Position Description

Our client is looking for QTP/Automation QA candidates with experience in the following:
· QuickTest Pro
· Selenium
· Agent facing or customer facing web application testing in Quoting, New Business, UW
· HP ServiceTest or soapUI Web Services Tester

The primary responsibility of the Test Engineer is to work with the development and quality assurance teams to automate testing, create new test harnesses, implement new automation tools and create innovative automated test systems. Assists in testing modifications to existing software to fit specialized needs and configurations, and maintains program libraries and technical documentation of software defects. Also, under general supervision, the Test Engineer develops, tests, and debugs, and is part of the IT team that implements operating system components, software tools, and utilities required for the operation, maintenance, and control of internally developed and commercial off-the-shelf (COTS) software.

Duties and Responsibilities

· Design, develop, modify, and execute automated scripts for GUI, regression, load, and performance testing using HP tools (QTP, Service Test, and LoadRunner)
· Document, troubleshoot, and isolate problems encountered during testing
· Interpret test automation results
· Prepare management reports documenting test execution results
· Participate in defining automation test strategy, structure, and methodology
· In conjunction with Business and Quality Assurance Analysts, analyze use cases and test scenarios to determine whether the functionality is an ideal candidate for automation
· Assist with the research and development of test automation best practices
· Develop performance test scripts in LoadRunner using VBA, HTTP, Web (Click and Script), and Citrix protocols
· Set up monitoring and execute performance testing on a wide range of client/server and web-based applications
· Analyze performance test results and identify performance bottlenecks
· Make recommendations for performance improvement
· Track performance-related defects/issues and enhancements
· Assume other responsibilities as directed

Knowledge, Skills, and Abilities

· BS in Computer Science or equivalent technical training and professional work experience
· A minimum of 5 years of cumulative experience in software development and/or testing
· Extensive background in QA methodologies and experience developing and executing comprehensive test suites
· Strong analytical and troubleshooting skills, as well as excellent written and verbal communication skills; able to communicate professionally at all levels within and outside of the organization
· Able to work independently and be a team player
· Ability to provide guidance and mentor team members
· Able to handle multiple projects simultaneously and meet deadlines while effectively managing priorities and communicating progress
· Strong working knowledge of MS applications (Excel, Word, Access)
· 5+ years of experience with SQL and working knowledge of relational database technologies
· Strong knowledge of the Software Development Life Cycle
· Strong technical and programming/test automation skills, as demonstrated by 3+ years developing and maintaining complex test automation and automation infrastructure using industry-standard products or languages such as C++, Visual Basic, or other scripting languages
· Experience developing data-driven test scripts that leverage database checkpoints, user-defined functions, database connections, and batch execution
· 3+ years working on performance testing initiatives involving test design, script development and execution, monitoring, and results reporting
· Understanding of infrastructure design and server technologies
· Experience developing performance test scripts using VBA, HTTP, Web (Click and Script), and/or Citrix protocols

Education and/or Experience

Worker characteristics are normally acquired through the successful completion of a 4-year college degree in a computer-related field and 5+ years of experience as a Development Engineer, Test Engineer, or Quality Assurance Analyst. The candidate possesses sound technical and programming/test automation skills, as demonstrated by 5+ years developing and maintaining complex test automation and automation infrastructure using industry-standard products or languages such as C++, Visual Basic, or other scripting languages. Experience in developing data-driven tests, database checkpoints, user-defined functions, and database connections from QTP, and in scheduling scripts for execution.



______________________________________________________________
Michelle Oelhafen
Marketing Assistant | Zebra-net, Inc.
9995 Muirlands Blvd. | Irvine | CA | 92618
949-900-6110 | 949-900-6118 – FAX
Women’s Business Enterprise Council (WBEC) Certified
We Promote Growth for People and Companies!