Brian Kurtz and I recently traveled to Grand Rapids, Michigan to attend CAST 2015, a testing conference put on by the Association for Software Testing (AST) and other members of the Context-Driven Testing (CDT) community. I was rewarded in a myriad of ways: new ideas, engaging sessions, fresh models. The most rewarding part of the conference, though, was the people and the connections made. Much of the CDT community lives on Twitter, so if you are new to testing or not involved in social media, I recommend you begin there. If you are looking for a starting point, check out my Twitter page here, Connor Roberts – Twitter, and look at the people I am following to get a good idea of who some of the active thought leaders in testing are. This community does a good job on Twitter of keeping the information flow clean and in general only shares value-add information. In keeping with that endeavor, my intention with this post is to share the shining bits and pieces that came out of each session I attended. I hope this is a welcome respite from the normal process of learning, which involves hours of panning for gold in the riverbanks only to reveal small shining flakes from time to time.
Keep in mind, this is only a summary of my biased experience: the notes I take focus on what I felt was valuable and important to me, based on what I currently know or do not know about the topics covered. My own notes and ideas are also mixed in with the content from the sessions, since the speakers often triggered thoughts in my head as they progressed. I did not keep track of, or delineate, which thoughts are theirs and which are my own.
It is also very likely that I did not document some points that others might feel are valuable, as the way I garner information differs from theirs. Overall, the heuristic Brian and I used was to prioritize the sessions that were not being live-streamed, since we knew the live sessions would be recorded and posted to the AST YouTube page after the conference. There are many other conferences worth attending, like STPCon, STAR East/West, etc., and I encourage testers to check them out as well.
“Testing Fundamentals for Experienced Testers” by Robert Sabourin
- Conspicuous Bugs – Sometimes we want users to know about a problem.
- E.G. A blood pressure cuff is malfunctioning so we want the doctor to know there is an error and they should use another method.
- Bug Sampling: Find a way to sample a population of bugs, in order to tell a better story about the whole.
- E.G. Take a look at the last 200 defects we fixed, and categorize them, in order to get an idea where product management believes our business priorities are.
- Dijkstra’s Principle: “Program testing can be used to show the presence of bugs but not their absence.”
- E.G. We should never say to a stakeholder, “This feature is bug-free”, but we can say “This feature has been tested in conjunction with product management to address the highest product risks.”
- “The goal is to reach an acceptable level of risk. At that point, quality is automatically good enough.” – James Bach
- Three Quality Principles: Durable, Utilitarian, Beautiful
- Based on the writings of Vitruvius (De architectura, a treatise on architecture and design still referenced today)
- Move away from centralized system testing, toward decentralized testing
- E.G. Facebook – Pushed new timeline to New Zealand for a month before releasing it to the world
- Talked about SBTM (Session Based Test Management): Timebox yourself to 60 minutes, determine what you have learned, then perform subsequent sessions by iterating on the previous data collected. In other words, use what you learn in each timeboxed session to make the next timeboxed session more successful.
- Use visual models to help explain what you mean. Humans can interpret images much quicker than they can read paragraphs of text. Used a mind map as an example.
- E.G. HTSM with subcategories and priorities
- Try to come up with constructive, rather than destructive, conversational models when speaking with your team/stakeholders.
- E.G. Destructive: “The acceptance criteria is not complete so we can’t estimate it”
- E.G. Constructive: “Here’s a model I use [show HTSM] when I test features. Is there anything from this model that might help us make this acceptance criteria more complete?”
- Problem solving: We all like to think we’re excellent problem solvers, but we’re really only ever good problem solvers in a couple of areas. Remember, your problem-solving skill is linked to your experience. If your experience is shallow, your problem-solving skill will lack variety.
- Heuristics (first known use 1887): Book “How To Solve It” by George Pólya.
- Be visual (models, mind maps, decisions charts)
- If you don’t know the answer then take a guess. Use your knowledge to determine how wrong the first guess was, and make a better one. Keep iterating until you reach a state of “good enough” quality.
- Large problems: Solve a smaller, similar problem first, then try to generalize from that sample so you can form a hypothesis about the larger problem’s solution.
- Decision Tables (a mathematical approach using boolean logic to express testing pathways to stakeholders – see slide deck)
- AIM Heuristic: Application, Input, Memory
- Use storyboarding (like comics) to visualize what you are going to test before you write test cases
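The decision-table idea above can be sketched in a few lines of code. This is a minimal illustration, not from the talk: the discount rules, function name, and condition names are all hypothetical, chosen only to show how a table of boolean condition combinations maps each testing pathway to an expected outcome.

```python
# A tiny decision-table sketch (hypothetical discount rules, for illustration):
# each row maps a tuple of boolean conditions to an expected outcome, giving a
# compact way to show stakeholders which input combinations are covered.
from itertools import product

def discount(is_member: bool, over_100: bool) -> int:
    """Hypothetical rule under test: members get 10%, orders over $100 add 5%."""
    return (10 if is_member else 0) + (5 if over_100 else 0)

# Decision table: (is_member, over_100) -> expected discount %
table = {
    (False, False): 0,
    (False, True): 5,
    (True, False): 10,
    (True, True): 15,
}

# Every boolean combination is enumerated, so no pathway is silently skipped.
for conditions in product([False, True], repeat=2):
    assert discount(*conditions) == table[conditions], conditions

print("all", len(table), "rules pass")
```

Because the table enumerates every combination explicitly, a stakeholder can read the coverage directly off the table instead of inferring it from prose test cases.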
“Moving Testing Forward” by Karen Johnson (Orbitz)
- Know your shortcomings: Don’t force it. If you don’t like what you do, then switch.
- E.G. Karen moved from performance testing into something else because she realized that even though she liked the testing, she was not very mathematical, which is needed to become an even better performance tester.
- Avoid working for someone you don’t respect. This affects your own growth and learning. You’ll be limited. Career development is not something your boss gives you, it is something you have to find for yourself.
- Office politics: Don’t avoid it; learn to get good at shaping and steering it. “The minute you have two people in a room, there’s politics.”
- Networking: Don’t just do it when you need a job. People will not connect with you at those times, if you have not been doing it all the other times.
- Don’t put people in a box, based on your external perceptions of them. They probably know something you don’t.
- Don’t be busy, in a corner, just focused on being a tester. Learn about the business, or else you’ll be shocked when something happens, or priorities were different than you “assumed”. Don’t lose sight of the “other side of the house”.
- Balancing work and personal life never ends, so just get used to it, and get good at not complaining about it. Everyone has to do it, and it will level out in the long term. Don’t try to make every day or week perfectly balanced – it’s impossible.
- Community Legacy: When you ultimately leave the testing community, which will happen to everyone at some point, what five things can you say you did for the community? Will the community have been better because you were in it? This involves interacting with people more than focusing on your process.
- Be careful of idolizing thought leaders. Challenge their notions as much as the person’s next to you.
- Goals: Don’t feel bad if you can’t figure out your long term goals. Tech is constantly changing, thus constant opportunities arise. In five years, you may be working on something that doesn’t even exist yet.
- If your career stays in technology, then the cycle of learning is indefinite. Get used to learning, or you’ll just experience more pain resisting it.
- Watch the “Test is Dead” talk from Google, 2011.
- Five years from now, anything you know now will be “old”. Are you constantly learning so that you can stay relevant?
- Be reliable and dependable in your current job, that’s how you advance.
- Act as if you have the title you want already and do that job. Don’t wait for someone to tell you that you are a ‘Senior’ or a ‘Lead’ before you start leading. Management tasks require approval, leadership does not.
- Care about your professional reputation, be aware of your online and social media presences. If you don’t have any, create them and start fostering them (Personal Website, Twitter for testing, etc.)
“Building A Culture Of Quality” by Josh Meier
- Two types of culture: Employee (ping pong tables) vs. Engineering (the way we ‘do’ things), let’s talk about the latter (more important)
- Visible (Environment, Behaviors) vs. Invisible (Values, Attributes)
- “A ship in port is safe, but that’s not what ships are built for.” – Grace Hopper
- Pair Tester with Dev for a full day (like an extended Shake And Bake session)
- When filing bug reports, start making suggestions on possible fixes. At first this will be greeted with “don’t tell me how to do my job”, but eventually it will be welcomed as a time saver. For Josh, this morphed into the developers asking him, as a tester, to sign off on code reviews as part of their DoD (Definition of Done).
- Begin participating in code-reviews, even if non-technical
- *Ask for partial code, pre-commit before it is ready so you can supplement the Dev discussions to get an idea of where the developer is headed.
- *Taxi Automation – Scripts that can be paused, allowing the user to explore midway through the checks; the checks then continue based on the exploration work done.
“Should Testers Code” (Debate format) by Henrik Anderson and Jeffrey Morgan
My Conclusion: Yes and No. No, because value can be added without becoming technical; however, if your environment would benefit from a more technical tester and it’s something you have the aptitude for, then you should pursue it as part of your learning. If you find yourself desiring to do development, but in a tester role, then evaluate the possibility that you may wish to apply for a developer position, but don’t be a wolf in sheep’s clothing; that does the product and the team a disservice.
- It takes the responsibility of creating quality code off the developer if testers start coding (Automation Engineers excluded)
- Training a blackbox tester for even 1 full hour per day for 10 months cannot replace years of coding education, training, and experience. This is a huge time-sink that, in the best case, produces a junior developer.
- The mentality that all testers should code comes from a lack of understanding about how to increase your knowledge in the skill-craft of testing. Automation is a single tool, and coding is a practice. If you are non-technical, work on training your mindset, not trying to become a developer.
My Other Observations:
- Do you want a foot doctor doing your heart surgery? (Developers spending majority time testing, Testers spending majority time developing?)
- People who say that all testers should code do not truly understand that quality is a team responsibility; they treat it as only a developer’s responsibility. Those who hold this stance, consciously or subconsciously, desire to make testers into coders, because only “then” will quality be the testers’ responsibility, since they will then hold the right role/title. Making testers code is just a sly way of saying that a manual exploratory blackbox tester does not add value, or at least not enough value, to belong on the team.
- By holding this viewpoint, you are also saying that you possess the sum of knowledge of what it means to be a good tester, and that you have reached a state of conscious competence in testing sufficient to claim that your definition of a “tester” is not flawed.
- The language we have traditionally used in the industry is what throws people off. People see the title “Quality Assurance” and think that only the person with that title should be in charge of quality, but this is a misnomer. We cannot claim that the team owns quality and then say that it is the tester’s responsibility to ensure the product in production is free from major product risks. Those are opposing viewpoints, neither of which addresses testing.
- Developers should move toward a better understanding of what it takes to test, while Testers should move toward a better understanding of what it takes to be a developer. This can be accomplished through collaborative/peer processes like Shake And Bake.
- I believe that these two roles should never fully come together and be the same. We should stay complex and varied. We need specialists, just like complex machines that have specialized parts. The gears inside a Rolex watch cannot do the job of the protective glass layer on top. Likewise, the watch band cannot do the job of keeping time, nor would you want it to. Variety is a good thing, and attempting to become great at everything makes you only partially good at any one thing. Brands like Rolex and Bvlgari have an amazingly complex ecosystem of parts; the more complex a creation, the more elegant its operation and output will be.
- Just like the ‘wisdom of the crowd’ can help you find the right answer (see session notes below from the talk by Mike Lyles), the myth of group reasoning can equally bite you. For example, a bad idea left unchecked in a given environment can propagate foolishness; this is why the role of the corporate consultant exists in the first place. With regard to testing organizations, keep in mind that just because an industry heads in a certain direction does not mean it is the correct direction.
“Visualize Testability” by Maria Kedemo
- Maria talked about the symptoms of low testability
- E.G. When Developers say, “You’ll get it in a few days, so just wait until then,” this prevents the Tester from making sure something is testable, since they could be sitting with the Devs as they get halfway through it to give them ideas and help steer the coding (i.e. bake the quality into the cake, instead of waiting until after the fact to dive into it)
- Get visibility into the ‘code in progress’, not just when it is committed at code review time. (Similar to what Josh Meier recommended; see other session notes above.)
- Maria presented a new model: Dimensions of Testability (contained within her slide deck)
“Bad Metric, Bad” by Joseph Ours
- Make sure your samples are proper estimates of the population
- I tweeted: “If you bite into a BLT, and miss the slice of bacon, you will estimate the BLT has 0% bacon”
- Division within Testing Community (I see a visual/diagram that could easily be created from this)
- 70% uneducated
- 25% educated
- 5% CDT (context-driven testing) educated/aware
“The Future Of Testing” by Ajay Balamurugadas
- My main takeaway was about the resources available to us as testers.
- Ministry of Testing
- Weekend Testing meetups
- Skype Face-to-face test training with others in the community
- Skype Testing 24/7 chat room
- Udemy, Coursera
- BBST Classes
- Test Insane (holds a global testing competition called ‘War With Bugs’, with cash prizes)
- Testing Mnemonics list (pick one and try it out each day)
- SpeakEasy Program (for those interested in doing conventions/circuits on testing)
- Also talked about the TQM Model (Total Quality Management)
- Customer Focus, Total Participation, Process Improvement, Process Management, Planning Process, etc.
- Ajay encouraged learning from other industries
- E.G. Medical, auto, aerospace, etc., by reading news sites about testing or product risks found in those industries. They may have applicable information that applies here.
- “You work for your employer, but learning is in your hands.” (i.e. Don’t wait for your manager to train you, do it yourself)
- Talked about the AST Grant Program – helps with PR, pay for meetups, etc.
- Reading is nice, but if you want to become good at something, you must practice it.
- Professional Reputation – do you have an online testing portfolio?
- On a personal note: He got me on this one. I was in the process then of getting my personal blog back up (which is live now), but also plan to even put up some screen recordings of how I test in various situations, what models I use, how I use them, why I test the way I do, how to reach a state of ‘good enough’ testing where product risks are mitigated or only minimal ones remain, how to tell a story to our stakeholders about what was and was not tested, understanding metrics use and misuse, etc.
- “Your name is your biggest certificate” – Ajay (on the topic of certifications)
“Reason and Argument for Testers” by Thomas Vaniotis and Scott Allman
- Discussed Argument vs Rhetoric
- Argument – justification of beliefs, strength of evidence, rational analysis
- Rhetoric – literary merit, attractiveness, social usefulness, political favorability
- They talked about drawing conclusions from premises. You need to make sure your premises are sound before you draw a conclusion based solely on conjecture that only ‘sounds’ good on the surface.
- Talked about language – all sound arguments are valid, but not all valid arguments are sound. There are many true conclusions that do not have sound arguments. No sound argument will lead to a false conclusion.
- Fallacies (I liked this definition) – a collection of statements that resemble arguments, but are invalid.
- Abduction – forming a conclusion in a dangerous way (avoid this by ensuring your premises are sound)
- Use Safety Language (Epistemic Modality) to qualify statements and make them more palatable for your audience. You can reach the same outcome and still maintain friendships/relationships.
- This was really a session on psychology in the workplace, not limited to testers, but it was a good reminder on how to make points to our stakeholders if we want to convince them of something.
- If you work with people you respect, then you should realize that they are most likely speaking with the product’s best interests at heart, at least from their perspective, and are not out to maliciously attack you personally. You can avoid personal attacks by speaking from your own experience. Instead of saying “That’s not correct, here’s why…” you can say “In my experience, I have found X, Y, and Z to be true, because of these factors…” In this way you make the same point without the confrontational bias.
- If you want to convince others, be Type-A when dealing with the product, but not when dealing with people. Try to separate the two in your mind before going into any conversation.
“Visual Testing” by Mike Lyles
- This was all about how we can be visually fooled as testers. Lots of good examples in the slide-deck, and he stumped about half of the crowd there, even though we were primed about being fooled.
- Leverage the Wisdom of the Crowd: Mike also did an exercise where he held up a jar of gumballs and asked us how many were inside. One person guessed 500, one person guessed 1,000. At that point our average was 750. Another person guessed 200, another 350, another 650, another 150, etc., and this went on until we had about 12 to 15 guesses written down. The average of the guesses came out to around 550, and the total number of gumballs was actually within 50–100 of this average. The point Mike was making was that leveraging the wisdom of the crowd to make decisions is smarter than trying to go it alone or relying on smaller subsets/sources of comparison. Use the people in your division, around you on your team, and even in the testing community at large to make sure you are on the right track and moving toward the most likely outcome that will best serve your stakeholders.
- This involves an intentional effort to be humble, and realize that you (we) do not have all the answers to any given situation. We should be seeking counsel for situations that have potentially sizable product impacts and risks, especially in areas that are not in our wheelhouse.
- Choice Blindness: People will come up with convincing reasons why to take a certain set of actions based on things that are inaccurate or never happened.
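The gumball exercise above is easy to reproduce numerically. The guesses and jar count below are illustrative stand-ins (loosely echoing the numbers in my notes, not the actual data from the session); the point is that the pooled estimate typically beats every individual guess.

```python
# Sketch of the wisdom-of-the-crowd effect with made-up guesses:
# individual guesses vary wildly, but their mean lands near the true count.
from statistics import mean

guesses = [500, 1000, 200, 350, 650, 150, 700, 450, 600, 400, 550, 500]
crowd_estimate = mean(guesses)

true_count = 520  # hypothetical number of gumballs in the jar
crowd_error = abs(crowd_estimate - true_count)
individual_errors = [abs(g - true_count) for g in guesses]

print(round(crowd_estimate))  # the crowd's pooled estimate
# How many individual guesses came closer than the pooled average did?
print(sum(e < crowd_error for e in individual_errors))
```

With these particular numbers, the pooled average is off by less than the best individual guess, which is the effect Mike demonstrated live.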
“Using Tools To Improve Testing: Beyond The UI” by Jeremy Traylor
- Testers should become familiar with more development-like tools (e.g. Browser Dev Tools, Scripting, Fiddler commands, etc.)
- JSONLint – a JSON validator
- Use Fiddler (Windows) or Charles (Mac)
- Learn how to send commands through these tools (POST, GET, etc.), not just use them to monitor output.
- API Testing: Why do this?
- Sometimes the UI is not complete, and we could be testing sooner and more often to verify backend functionality
- You can test more scenarios than simply testing from the UI, and you can test those scenarios quickly if you are using script to hit the API rather than manual UI testing.
- Some would argue that this invalidates testing since you are not doing it how the user is doing it, but as long as you are sending the exact input data that the UI would send then I would argue this is not a waste of time and can expose product risks sooner rather than later.
- Gives testers better understanding of how the application works, instead of everything beyond the UI just being a ‘black box’ that they do not understand.
- Some test scenarios may not be possible in the UI. There may be some background caching or performance tests you want to do that cannot be accomplished from the front end.
- You can have the API handle simple tasks rather than rely on creating front-end logic conversions after the fact. This increases testability and reliability.
- Postman (Chrome extension) – a backend HTTP testing tool with a nice GUI/front-end. This helps lower the barrier to entry for testers who may be firmly planted in the blackbox/manual-only world and want to increase their technical knowledge to better help their team.
- Tamper Data (add-on for Firefox) – can change data while it is en route, so you can better simulate domain testing (positive/negative test scenarios).
- SQL Fiddle – This is a DB tool for testing queries, scripts, etc.
- Other tools: SOAPUI, Advanced Rest Client, Parasoft SOAtest, JSONLint, etc.
- Did you know that the “GET” method can be used to harvest data (PII, user information, etc.)? Testers, are you checking this? (HTSM > Quality Criteria > Security). However, a “GET” response can ‘lie’, so you want to check the DB to make sure the data it returns is actually true.
- Explore what works for you and your team/product, but don’t stick your head in the sand and just claim that you are a manual-only tester. You have to at least try these tools and make a genuine effort to use them for a while before you can discount their effectiveness. Claiming they would not work for your situation or never making time to explore them is the same as saying that you wish to stay in the dark on how to become a better tester.
- Since security testing is not one of my fortes, I personally would like to become a better whitebox hacker to aid in my skill-craft as a tester. This involves trying to game the system and expose security risks, but for noble purposes. Any risks found then go to help better inform the development team and are used to make decisions on how the product can be made more secure. Since testers are supposed to be informers, this is something I need to work on to better round out my skill-set.
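The below-the-UI testing Jeremy described can be shown in a short, self-contained sketch. Everything here is hypothetical: the `/users/42` endpoint, the field names, and the stub server (which stands in for a real backend so the example runs anywhere) are my own illustration, not from his talk.

```python
# Minimal sketch of testing an API below the UI: hit the endpoint directly
# and assert on the payload, without waiting for a front end to exist.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    """Stands in for the real backend so the sketch is runnable offline."""
    def do_GET(self):
        body = json.dumps({"id": 42, "name": "Ada"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request console logging
        pass

# Start the stub on an ephemeral port in a background thread.
server = HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# The "test": call the endpoint the UI would call and check the raw payload.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/users/42") as resp:
    assert resp.status == 200
    user = json.loads(resp.read())

print(user["name"])  # the field the UI would eventually render
server.shutdown()
```

In practice a tool like Postman, Fiddler, or a few lines like these let you exercise far more input combinations per hour than driving the same endpoint through the UI, and they surface backend risks before the front end is even built.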
“When Cultures Collide” by Raj Subramanian and Carlene Wesemeyer
- Raj and Carlene spent the majority of the time talking about communication barriers such as differences in body language, the limitations of text-only media (chat or email), and the assumptions certain cultures make about others, regardless of whether they are within the same culture or not.
- Main takeaway: Don’t take a yes for a yes and a no for a no. Be over-communicative if necessary to ensure that the expectations you have in your head match what they have in their head.
I hope that my notes have helped you in some way, or at the very least exposed you to some new ideas and knowledgeable folks in the industry from whom you can learn. Please leave comments here on which area you received the most value from, or where you need clarification. Again, these are my distilled notes from the four days I was there, so I may be able to recall more or update this blog if you feel one area is lacking. If you also went to CAST 2015, and attended any of the same sessions, then I’d love to hear your thoughts on any important points I may have overlooked that would benefit the community.