Silicon Valley Software Quality Association (SSQA)
Invited Guest Speakers
DATE | SPEAKER(S) | ORGANIZATION | TOPIC |
November 13, 2012 | Doug Hoffman | Software Quality Methods, LLC. | "Self-Verifying Data" |
October 9, 2012 | Jon Bach | eBay | "Exploratory on Purpose" |
September 11, 2012 | Kaarthick Subramanian | CSC | "Mobile Application Testing – Challenges & Best Practices" |
August 14, 2012 | Paul Linares | Crosstest, Inc. | "White Box Test Automation Framework for C/C++" |
SSQA | Minutes of the first SSQA meeting, 8/18/1987 | ||
July 10, 2012 | Elisabeth Hendrickson | Quality Tree Software and Agilistry | "Expeditions to the Unknown: Discovering Surprises and Risks in Software" |
June 12, 2012 | Tammy Davis, Ken Doran, moderators | SSQA | "Brainstorming!" (Q&A regarding SSQA, future SSQA topics and speakers) |
May 8, 2012 | Duvan Luong | Operational Excellence Networks | "Operational Excellence for SQA" |
April 10, 2012 | Doug Hoffman | Software Quality Methods, LLC. | "Exploratory Test Automation" |
March, 2012 | None | No SSQA Meeting This Month | |
February 9, 2012 | Brian Lawrence | Coyote Valley Software | "How to use lies to get the quality you need" |
January 10, 2012 | Sandeep Khabiya | Hewlett-Packard Wry Toastmasters Club | "Conquer your Speaking Fear" |
Jon Regan | Hewlett-Packard Wry Toastmasters Club | "Powerful Speaking Without Preparation" | |
November 8, 2011 | Karen Burley | Hewlett-Packard | "Can QA Innovate?" |
Mary Ann May-Pumphrey | Adobe EchoSign | "Converting from Selenium/Perl to Selenium/Python with the Page Object Model: Four Very Useful Aids" | |
Doug Hoffman | Software Quality Methods, LLC | "Self-Verifying Data" | |
October 11, 2011 | Mary Ann May-Pumphrey, moderator | SSQA! | "SSQA in 2012!" |
September 13, 2011 | Don Miller | Product Life Cycle Process Architect, PayPal | Quality Before Code: The Beginning of the Product Life Cycle |
Karen Burley | Engineering Section Manager, HP | A Case Study in Agile QA | |
August, 2011 | None | No SSQA Meeting This Month | |
July 12, 2011 | Jeff Richardson | Chief Transformational Engineer | Empowered Alliances |
June 14, 2011 | Hemant Gaidhani | Senior Technical Marketing Manager, VMware | Performance & Scalability Testing Virtual Environment |
May 10, 2011 | Jane Fraser | QA Director, Pogo | To Fix or Not to Fix - that is the Question |
Adrienne Hunter | QA Manager, PACE Anti-Piracy | Is Sikuli a viable QA tool? | |
Susan McVey | Software Quality Engineer, IBM Silicon Valley Lab | Starting a Geographically Distributed Test Team | |
Tim Stein | CEO / President, Business Performance Associates, Inc. | Verification in the Development of Medical Device Software Per IEC 62304 | |
Forest Weld | Director of QA & Support, Arxan Technologies | Test/Support Synergy | |
Malvika Agrawal | Software Quality Engineer, IBM | Book review of: "Implementing Automated Software Testing: How to Save Time and Lower Costs While Raising Quality" | |
April 12, 2011 | Adam Christian | JavaScript Architect, Sauce Labs | Windmill - The Selenium Oppugner |
March 8, 2011 | Henry Cate | QA Engineer, Teradata Corp. | Monkey: Tool for Generating Random SQL Test Cases |
February 8, 2011 | Jon Bach | Director of Quality Engineering, eBay's Search and Discovery team | Applying Creative Thinking to Quality and Testing Problems |
January 11, 2011 | Nirmala Anisetti | Quality Engineering Manager, Yahoo! | Security Testing: Paranoid Approach |
December 14, 2010 | None | No SSQA meeting this month | Happy Holidays, See you in 2011! |
November 9, 2010 | None | No SSQA Meeting This Month | |
October 12, 2010 | Doug Hoffman | Software Quality Methods LLC, ASQ Fellow | Why Tests Don't Pass |
September 14, 2010 | Sandeep Bhatia | Sr. Development Manager for Quality, Intuit | How Agile Takes Care of Quality |
August 10, 2010 | None | Meeting cancelled due to speaker illness | |
July 13, 2010 | Nixon Augustin | Software Engineer, Brocade Communications Systems | Introduction to TestLink |
June 8, 2010 | Alex Pineda | Sr Software Development Manager, Oracle Corp. | The Myths and Pitfalls of QA |
May 11, 2010 | Sunita Vaswani | Quality Engineer, IBM Rational | Effective Application of Software Test Automation at IBM Rational |
April 13, 2010 | Ramesh Mandava | Engineering Manager, eBay | Pushing Quality Upstream at eBay |
March 9, 2010 | Jeff Richardson | Chief Transformational Officer, Empowered Alliances | The Secrets of Successful Networking - Expanding Your Professional Network |
February 9, 2010 | Susan McVey | Software Quality Engineer, IBM Rational | Software Testing for the Long Term |
January 12, 2010 | None | SSQA meeting cancelled due to scheduling conflict | |
December 8, 2009 | None | No SSQA meeting this month | Happy Holidays, See you in 2010! |
November 10, 2009 | Nixon Augustin | Software Engineer, Brocade Communications | Tutorial on Using STAF (open source test automation framework) |
October 13, 2009 | Rutesh Shah | Founder/CEO, InfoStretch Corp. | Enterprise 2.0 and Testing Challenges |
September 8, 2009 | Tim Riley, Murali Nandigama | Director of QA, Mozilla Corp.; Consultant | Effective Gap-Centric Test Development Strategies Using Code Coverage and Test Case Mapping |
August 11, 2009 | Jagadesh Munta | Sr. Software Engineer, Sun Microsystems | Taking Quality to Developer Desktop: Java Static Analysis with FindBugs |
July 14, 2009 | Ken Doran | Administrative Systems, Stanford University | Building Your Software QA Library |
June 9, 2009 | Jonathan Lindo | Co-founder/CEO, Replay Solutions Inc. | TiVo(tm) for Software, the future is now! |
May 12, 2009 | T Ashok | Founder/CEO, STAG Software (Bangalore India) | Hypothesis-Based Testing |
April 14, 2009 | Nixon Augustin | Software Engineer, Brocade Communications | Multi-Client Testing Using STAF (open source, test automation framework) |
March 10, 2009 | Mary Ann May-Pumphrey | Software QA Engineer and De Anza College Instructor | Automated Web Page Testing with Selenium IDE: A Tutorial |
February 10, 2009 | Russell Pannone | Agile Product Development Practitioner and Coach | Quality Assurance in the World of Agile Systems / Software Development |
January 13, 2009 | None | SSQA meeting cancelled due to scheduling conflict | |
December 9, 2008 | None | No SSQA meeting this month | Happy Holidays, See you in 2009! |
November 18, 2008 | Satya Dodda | Director of Software Quality Engineering, Sun Microsystems | Test-Driven Development Best Practices |
October 14, 2008 | John Green | Sr. Staff Engineer, VMware | Success with Test Automation |
September 9, 2008 | Yashwant Shitoot | Consultant | Beyond Testing - Achieving Software Excellence |
August 12, 2008 | Sriram Lakkaraju | QA Manager, Sun Microsystems | How To Ensure Enterprise Software Is Highly Available |
July 8, 2008 | Jim Singh | Director of Technology, VMLogix | Test Lab Virtualization |
June 10, 2008 | Sachin Bansal | Senior Quality Engineering Manager, Adobe Systems | Total Automation! |
May 13, 2008 | Samir Shah | Founder/CEO, Zephyr | Enterprise 2.0 is here - Upgrade your Test Department! |
April 8, 2008 | Tim Riley | Director of Quality Assurance, Mozilla Corporation | The SQA Approach on the Mozilla Project - How Firefox gets Tested |
March 11, 2008 | Madhava Avvari | QA Manager, Ad Serving Systems, Yahoo! | Testing a Cool Internet Technology called "Ad Serving Systems" |
February 12, 2008 | Peter Jensen | Software Architect, Sun Microsystems | Clichés, Metrics, and Methods: A Discussion of the Quality System and its Role in Contemporary Software Development |
January 8, 2008 | Doug Hoffman | Consultant, Software Quality Methods LLC | Exploring an Expanded Model for Software Under Test |
December 11, 2007 | None | No SSQA meeting this month | Happy Holidays, See you in 2008! |
November 13, 2007 | Murali Nandigama | Senior Development Manager, Oracle Corporation | Security in the Software Development Life Cycle |
October 16, 2007 | Kowsik Guruswamy | Co-founder and CTO, Mu Security | Life is not static, so why are your test cases? |
September 11, 2007 | Doug Hoffman | Software QA Program Manager, Hewlett-Packard | A Graphical Display of Testing Status for Complex Configurations |
August 14, 2007 | Tim Riley | Director of Quality Assurance, Mozilla Corporation | Testing in the World of Open Source Software |
July 10, 2007 | Gopal Jorapur | Engineer, Sun Microsystems | Closing the Loop On Quality - Integrating Customer Feedback |
June 12, 2007 | Cédric Beust | Senior Software Engineer, Google | Next Generation Testing with TestNG |
May 8, 2007 | Larry Steinhaus, Sujoy Ghosh | Program Managers, NonStop Div., Hewlett-Packard | Developing and Using a Defect Removal Model to Predict Customer Experiences on Software Products |
April 10, 2007 | Brian Lawrence | Principal, Coyote Valley Software | The Software Project as a Journey |
March 13, 2007 | Anita Wotiz | Program Coordinator, Software Engineering, UCSC Extension | Requirements Management, an Integral part of Quality Release |
February 13, 2007 | Sachin Bansal | Senior Quality Engineering Manager, Adobe Systems | How to Design Regression Test Automation Frameworks for System Testing |
January 9, 2007 | David Roland | Sr. Computer Scientist, Computer Sciences Corp., NASA Ames Research Center | Keeping Score - How to Know When You're Done |
December 12, 2006 | Tim Stein, Jason Reid, Alka Jarvis, James Cunningham | Local Authors | Book Signing and Holiday Festivities with Local Authors |
November 14, 2006 | Doug Hoffman | Program Manager, Hewlett-Packard | Test Automation Beyond Regression |
October 10, 2006 | Hans Buwalda | Chief Technology Officer, LogiGear Corp. | The 5 Percent Rules of Test Automation |
September 12, 2006 | Mark Himelstein | President, Heavenstone Inc. | Who is Responsible for Quality? |
August 8, 2006 | Duy Bao Vo | Graduate Student, San Jose State University | Quality-Driven Build Scripts for Java Applications |
July 11, 2006 | David Hsiao | Director, Metrics Strategy and Benchmarking COE, Cisco Systems | Metrics, Benchmarking and Predictive Modeling |
June 13, 2006 | Duvan Luong | Technical Director for Enterprise Quality, Cadence Design Systems | Fighting the BUG WAR with Inspections and Reviews: A Success Story |
May 9, 2006 | Lisa K. Arnold, Anu Ranganath | Customer Satisfaction Analyst, Cisco Systems; Voice-of-the-Customer Program Manager, Cisco Systems | Using Data-driven Analysis to Increase Customer Satisfaction |
April 11, 2006 | Mikhail Portnov | Founder, Portnov Computer School | Software Testing as a Career – Still Viable? |
March 14, 2006 | Brian Lawrence | Principal, Coyote Valley Software | Software Engineering: Facts or Fancy? |
February 14, 2006 | Claudia Dencker, Doug Hoffman | President, Software SETT Corporation; President and Principal Consultant, Software Quality Methods, LLC | QA Road Warriors |
January 10, 2006 | Lew Jamison | CEO / Learning Strategist, Performance Improvement Circle | The T in Quality |
December 13, 2005 | Lew Jamison | CEO / Learning Strategist, Performance Improvement Circle | Quality Training: What's been your experience? |
November 8, 2005 | Dr. George Bozoki | Founder, Target Software | Estimating Software Size |
October 11, 2005 | Jeff Jacobs | Covad Communications, Jeffrey Jacobs and Associates | Logical Entity/Relationship Modeling: The Definition of Truth for Data |
September 13, 2005 | Aditya Dada | Sun Microsystems | Automation Techniques for Enterprise Application Testing |
August 9, 2005 | Keith Mangold | QAnalysts | A Process Driven Approach for Effective Application Service Quality for IT Organizations |
July 12, 2005 | Doug Hoffman | SDT Corporation | Early Testing Without the "Test and Test Again" Syndrome |
June 14, 2005 | Jason Reid | Sun Microsystems | Sabotaging QA: a Primer |
May 10, 2005 | Robert Konigsberg | Network Evaluation | The State of Spyware |
April 12, 2005 | Andy Tucker and Keith Wesolowski | Sun Microsystems | Solaris and Open Source - Current Status |
March 8, 2005 | Dave Segleau | Sleepycat Software | QA and Open Source - The Good, the Bad and the Ugly |
February 8, 2005 | Roundtable Discussion | SSQA Membership | War Stories from the Ground Level |
January 11, 2005 | Yana Mezher, Dave Weir, Dave Liebreich | Software SETT Corp.; Calavista; Yahoo | Tips for Managing an Offshore Team from Three Who Know |
December 14, 2004 | Sean Nihalani | UC Santa Cruz Extension | Outsourcing in Software Engineering |
November 9, 2004 | Tim Stein | Business Performance Associates | Part 11 - Electronic Records and Electronic Signatures: Review of the Regulation and a Discussion of Issues |
October 12, 2004 | Damien Farnham | Senior Manager, Solaris Performance, Sun Microsystems | Why Performance QA is Broken and How to Fix it |
September 14, 2004 | William Estrada | Mt Umunhum Wireless | SNMP: A primer |
August 10, 2004 | Peter Yarbrough | Software SETT Corp. | Staying Relevant in a Competitive Market |
July 13, 2004 | Merrin Donley | Silicon Valley Workforce Investment Board | Market Based Job Searching |
June 8, 2004 | Gail Lowell | InPhonic | Test Variables Impacting Wireless Applications |
May 11, 2004 | Rhonda Farrell and Jason Reid | Self and Sun Microsystems | Security Testing |
April 13, 2004 | Ross Collard | Collard & Company | Realistically Estimating Test Projects |
March 9, 2004 | Panel Discussion | SSQA Membership | War Stories at the Ground Level |
February 10, 2004 | Elisabeth Hendrickson | Quality Tree Software | Roll Your Own .NET Automated Tests |
January 13, 2004 | Rob Robason | Intrinsyx Technologies | A Case Study in Best Practices in Software Process Documentation: Space Station Software Project Measurement and Analysis |
December 9, 2003 | Jeff Jacobs | Jeffrey Jacobs & Associates | Capability Maturity Model for Software (CMM) |
Self-Verifying Data
Some tests require large data sets. The data may be database records, financial information, communications data packets, or any of a host of others. The data may be used directly as input for a test, or it may be pre-populated as background records. Self-verifying data (SVD) is a powerful approach to generating large volumes of information in a way that can be checked for integrity. The presentation describes three methods for generating SVD, two of which can be used easily. Topics include:
- What is self-verifying data?
- Why and how self-verifying data can be used
- Applications where such data is useful
- Ways to apply self-verifying data
- Checking data records generated this way
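As a rough illustration of the idea (a sketch of my own, not one of the talk's three methods): one simple way to make records self-verifying is to embed a checksum of each record's contents in the record itself, so that verification needs no external oracle. The field names below are hypothetical.

```python
import hashlib

def make_record(record_id, payload):
    """Build a record whose integrity can be checked from the record alone."""
    digest = hashlib.sha256(f"{record_id}:{payload}".encode()).hexdigest()
    return {"id": record_id, "payload": payload, "check": digest}

def verify_record(record):
    """Recompute the embedded checksum; any corruption causes a mismatch."""
    expected = hashlib.sha256(
        f"{record['id']}:{record['payload']}".encode()
    ).hexdigest()
    return record["check"] == expected

# Generate a large batch, then confirm every record is still intact.
records = [make_record(i, f"row-{i}") for i in range(100_000)]
print(all(verify_record(r) for r in records))   # intact data passes
records[42]["payload"] = "corrupted"
print(verify_record(records[42]))               # tampering is detected
```

Because each record carries its own answer, a test can populate a database with millions of such rows and later sweep through them checking integrity without keeping a separate copy of the expected data.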
Speaker: Doug Hoffman (President, Software Quality Methods, LLC)
Doug has more than thirty years of experience with software engineering and quality assurance. Today he teaches and does management consulting in strategic and tactical planning for software quality. Training specialties include context-driven software testing, test oracles, and test automation design. His technical specialties include test oracles, test planning, automation planning, and developing test architectures. Management specialties include ROI-based planning of software engineering mechanisms, QA management, organizational assessment, evaluating and planning for organizational change, managing and creating project management offices, building new quality organizations, and transforming existing quality assurance and testing groups to fulfill corporate visions.
He has been Chair of the Silicon Valley section of the American Society for Quality (ASQ) and is the current Chair of the Silicon Valley Software Quality Association (SSQA). He is a founding member and past member of the Board of Directors of the Association for Software Testing (AST), a member of the ACM and IEEE, and a Fellow of the ASQ.
----------------------------------------
Exploratory on Purpose
Contrary to what you might think, exploratory testing has structure, much in the same way jazz has structure or a conversation with a friend has structure. It may not seem like it, but once you know what to look for, you won't be able to ignore it. This presentation is about simple methods, tactics, and tools you can use to leverage the structures in exploration to quickly find important bugs, while responding to any scrutiny about what you did to find them.
“If we are alert, with minds and eyes open, we will see meaning in the commonplace; we will see very real purposes in situations which we might otherwise shrug off and call ‘chance’.” -- from a lecture by Jon’s grandfather, Roland Bach.
Speaker: Jon Bach (Director, Live Site Quality, eBay)
Jon's role at eBay is to coordinate efforts to find production bugs on eBay's related sites and take measures to improve customer experiences through bug advocacy. A veteran of more than a dozen STAR conferences, he delights in telling stories from 17 years in testing for companies like LexisNexis, Hewlett-Packard, and Microsoft. He is an award-winning keynote speaker and blogger, but his claim to fame is as co-creator of Session-Based Test Management, a way to manage and measure efforts from exploratory testing.
----------------------------------------
Mobile Application Testing – Challenges & Best Practices
As mobile adoption continues to grow, from enterprise applications to consumer applications, companies recognize the potential to boost revenue, decrease costs, and reach out efficiently to their customers. However, the mobile market is becoming increasingly competitive and complex. The huge diversity of devices, operating systems, OS versions, carriers, etc. makes it virtually impossible to sustain reasonable quality standards for mobile applications and websites across platforms. The complexity starts with the multitude of mobile devices having different screen sizes, hardware configurations, and image rendering capabilities. In addition, the proliferation of operating platforms and the hundreds of mobile phone carriers worldwide working on diverse local network standards (GSM, CDMA, etc.) is putting further strain on development teams. Hence, testing becomes more complicated and mobile QA is becoming increasingly expensive. This presentation addresses mobile QA challenges, such as minimizing the cost of mobile QA without compromising on application quality, mobile QA test infrastructure, and best practices in the area.
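To make the scale of the problem concrete, here is a back-of-the-envelope sketch (illustrative device, OS, and carrier names invented for the example, not from the talk): even a modest configuration matrix multiplies out quickly, which is why teams often fall back on sampled or combinatorial selection rather than exhaustive coverage.

```python
from itertools import product

devices = ["phone-small", "phone-large", "tablet"]            # hypothetical matrix
os_versions = ["os-4.0", "os-4.1", "os-5.0", "os-6.0"]
carriers = ["carrier-gsm-1", "carrier-gsm-2", "carrier-cdma-1"]
networks = ["3G", "4G", "WiFi"]

# Exhaustive coverage: every combination of every dimension.
full_matrix = list(product(devices, os_versions, carriers, networks))
print(len(full_matrix))  # 3 * 4 * 3 * 3 = 108 configurations per test case

# A simple "each-choice" reduction: cover every individual value at least once
# by cycling through the dimensions in parallel.
longest = max(len(devices), len(os_versions), len(carriers), len(networks))
reduced = [
    (devices[i % len(devices)], os_versions[i % len(os_versions)],
     carriers[i % len(carriers)], networks[i % len(networks)])
    for i in range(longest)
]
print(len(reduced))      # 4 configurations still touch every single value
```

Each-choice coverage is the weakest combinatorial criterion; stronger pairwise selection costs more configurations but still far fewer than the full product, which is the trade-off mobile QA teams typically navigate.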
Speaker: Kaarthick Subramanian (Regional Practice Leader, Global Applications, CSC)
Kaarthick Subramanian currently serves in the Independent Testing Services organization at CSC, where he leads the Banking, Financial Services and Insurance (BFSI) testing practice. Earlier, at Polaris Software Lab Ltd (financial technology company), Kaarthick served as VP and Head of Testing Engagements (Strategic Accounts). Kaarthick's engagement with Fortune 500 companies has involved deploying strategies, techniques and tools around test management, functional testing, and the automation of enterprise applications. Kaarthick has helped organizations like Polaris and Lionbridge (and their clientele) build efficient practices in the QA and Test Management space. You may reach Kaarthick at ksubramani30@csc.com.
----------------------------------------
White Box Test Automation Framework for C/C++
White box testing can be time-consuming and can even reduce product quality if done in the dark. Combining test design, setup, execution, and analysis in a single integrated solution can reduce test time and increase quality by facilitating and visualizing code and tests throughout development and debugging. The talk presents a white box test framework that tackles the problem of insufficient early-stage testing. A demonstration of the framework will be part of the presentation.
Speaker: Paul Linares (VP, Customer Solutions, Crosstest, Inc.)
Paul Linares serves as VP of Customer Solutions at CrossTest. Earlier, at Atempo, a data protection and backup software solution provider, Paul was in charge of customer satisfaction as worldwide Vice President of customer support. Previously, Paul was Vice President of Operations at NetCentrex, a voice-over-IP leader, and held Vice President of Engineering positions at several startups (online training and online banking). Prior to that, he helped drive the rapid expansion of CETIA, a subsidiary of Thales, a 12-billion-euro electronics, defense, and aerospace corporation. Paul earned a BSEE from Ecole Nationale de l'Aéronautique et de l'Espace and an MS from the California Institute of Technology, where he specialized in Computer Science and Electrical Engineering.
----------------------------------------
This meeting marked the 25th anniversary of the first meeting of what would become SSQA-SV. Doug Hoffman provided a copy of the minutes from that first meeting of the "Software Quality Assurance 'Creative Force'".
Expeditions to the Unknown: Discovering Surprises and Risks in Software
Exploratory testing involves learning about the software while simultaneously designing and executing tests, using feedback from the last test to inform the next. It's an approach that reveals risks and vulnerabilities no one thought about or could even have predicted in advance. When combined with automated unit and system level tests, it leads to high quality software with significantly fewer post-deploy surprises. In this session, you'll discover how you can systematically explore to discover surprises at all levels, from GUIs to code.
Speaker: Elisabeth Hendrickson (Founder, Quality Tree Software and Agilistry)
Elisabeth Hendrickson wrote her first line of code in 1980 on a TRS Model I and has been hooked on software development ever since. She's the founder and president of Quality Tree Software, Inc., a consulting and training company dedicated to helping software teams deliver working solutions consistently and sustainably. She also founded Agilistry Studio, a practice space for Agile software development in Pleasanton, CA. She served on the board of directors of the Agile Alliance from 2006 - 2007 and is one of the co-organizers of the Agile Alliance Functional Testing Tools program. Elisabeth splits her time between teaching, speaking, writing, and working on Agile teams with test-infected programmers who value her obsession with testing. You can find her on Twitter as @testobsessed and at http://www.qualitytree.com.
June 12, 2012 - SSQA Brainstorming Meeting
This meeting was a discussion led and moderated by Tammy Davis and Ken Doran: a brainstorm of future group meeting topics along with discussion of various ideas regarding the group, its purpose, and its future. Here are notes from the meeting.
Operational Excellence for SQA
Speaker: Duvan Luong (Founder, Operational Excellence Networks)
Duvan Luong received his PhD in Information and Computing Science from Lehigh University in Pennsylvania. Duvan has 30 years of experience practicing operational excellence at IBM, Synopsys, Sun, HP, and Cadence, both as an individual technical contributor and in management. Drawing on his lifelong experience in organizational and business operational improvement, Duvan has authored an operational excellence methodology and framework, providing implementation guidelines and practices for those wishing to implement operational excellence in their companies. Further, Duvan founded the Operational Excellence Networks to assist companies in their implementation efforts. (Contact Duvan at: operationalexcellencenetworks.com@gmail.com.)
Doug Hoffman is a management consultant in testing/QA strategy and tactics.
How to use lies to get the quality you need
People and organizations sometimes use lies to reach the level of quality that they deliver. Beyond the "standard" lying techniques, one could categorize these as lies in paradigms, lies of logic, and lies with numbers. We'll examine these, and perhaps learn to use some of them to get the quality we need. This subject applies beyond software quality.
Speaker: Brian Lawrence (Principal, Coyote Valley Software)
Brian Lawrence is a software consultant with a long history in the computing industry. His primary focus is teaching and facilitating requirements activities, as well as inspection, project planning, risk management, life cycles, and design specification techniques.
Brian has served as the editor of what is now Better Software Magazine, sat on the editorial board of IEEE Software for 5 years, and has chaired conferences. He taught software engineering at the University of California Santa Cruz Extension for over a decade. Contact: brian@coyotevalley.com.
January 10, 2012 - "Conquer Your Fear of Public Speaking"
This month's speakers from HP's Wry Toastmasters Club gave two presentations (see below) and involved the attendees in practice exercises geared toward improving one's public speaking and overcoming the mental or emotional blocks that seem to stand in the way.
For more information on Toastmasters, visit these sites:
http://Wry.freetoasthost.us/
http://www.toastmasters.org/
November 8, 2011 - "Lightning Talks!"
The meeting consisted of 3 Lightning Talks where each speaker had 5 minutes for a mini-presentation (talk or review), and then another 5 minutes for Q&A.
Bio: Karen Burley is an Engineering Section Manager at HP Software, managing several QA teams working on Enterprise Archiving products. Karen has over 25 years of software development and management experience ranging from real-time embedded microprocessor-based products to mission-critical data center system software to Software-as-a-Service, with a strong focus on process improvement and test automation. Karen has helped develop numerous products with multi-national teams in the US, India, Slovenia, Japan, and China, with teams that ranged from 2 to over 200. Karen has a BS degree in Computer Science from the University of Illinois with graduate work at Northwestern University. Creating high-quality products that contribute significantly to the company’s revenue and growth, that provide a great customer experience and yield high customer satisfaction, is Karen's passion.
Whether we are doing manual or automated testing, we judge how well the software under test behaves using some kind of oracle. Sometimes it is not obvious what the outcome should be. For example, we can generate large volumes of test data using programs and scripts. One problem with that generated data is confirming that it remains intact after we run the test exercise. One approach to the oracle problem is to embed the answer in the test data itself, called self-verifying data. The talk presents some ways to view, generate, and use self-verifying data oracles.
October 11, 2011 - "SSQA in 2012!"
This meeting was a discussion led/moderated by Mary Ann May-Pumphrey, the SSQA Programs Director. It had an agenda of:
topics and speakers for the meetings of 2012
meeting format(s)
every-other-month meetings rather than monthly ones
meeting cancellation criteria
use of a meetup.com site for SSQA
SSQA member involvement in serving as speakers and/or officers
how to decide whether to have speakers back (and if so, how soon)
how and when to hold officer elections [existing process in the Bylaws, 4.3]
These are the resulting notes captured during the meeting.
The software product life cycle begins the moment an idea is considered. The path to quality is charted long before the design, code, test, and release activities that we traditionally focus on. In this talk, key decisions in the first steps of the life cycle are considered to understand how quality can be determined before the first line of code is written.
Bio: Don Miller designs, deploys, and manages processes and tools for systematic innovation. As the Product Life Cycle Process Architect for PayPal, he is responsible for the development and continuous improvement of the common framework that PayPal uses to deliver simple and secure online and mobile payment solutions. Don has 20 years of experience leading development, QA, process, and infrastructure efforts at Silicon Valley high-tech companies including Sun Microsystems, eBay, and Intuit. Recently, his work at PayPal has exposed him to the front end of the life cycle. Don earned a BS in Computer Engineering from the University of Illinois and an MBA from San Diego State University.
The Team Effectiveness "Learning Lab"
(Transforming Personal & Team Effectiveness)
Speaker: Jeff Richardson (Chief Transformational Engineer, Empowered Alliances)
This will be an interactive exploration of what drives teams crazy (and what to do about it). Every team has the capability to achieve extraordinary performance results, so why do so few ever realize this potential? We know why. There will be an introduction to some of the fundamental challenges associated with how teams make decisions and how they can be overcome. We'll examine your brain through the lens of neuroscience research to understand its strengths and limitations, then relate that back to specific challenges pertaining to creativity, roles, change, team processes, and trust.
June 14, 2011
Performance & Scalability Testing Virtual Environment
Speaker: Hemant Gaidhani (Senior Technical Marketing Manager, VMware)
Virtualization is becoming the new paradigm for application deployment in data centers. This talk discusses how virtualization affects performance, and in turn performance and scalability testing. The talk lists common pitfalls and best-practice recommendations for performance and scalability testing in virtual environments.
May 10, 2011 - "Software QA Lightning Talks"
April 12, 2011
Windmill - The Selenium Oppugner
Speaker: Adam Christian (JavaScript Architect, Sauce Labs)
Get a different perspective on functional web testing! Windmill is a web testing tool designed to let you painlessly automate and debug your web applications. Windmill took a different route than Selenium, and you might find it refreshing. Windmill seeks to make test writing easier, portable, and sustainable. Gain some insight into the evolution and future of automated testing tools!
June 9, 2009
TiVo(tm) for Software, the future is now!
Speaker: Jonathan Lindo (Co-founder/CEO, Replay Solutions Inc.)
The concept of 'TiVo for Software' has been described by some as a potential Holy Grail for application teams. Software record/replay systems have existed in various forms for several years. All of these are designed to give deeper insight into the inner workings of your applications, while at the same time allowing teams to spend less time setting up and reproducing the original conditions your software was running in. In 2009, this technology finally reached the point where it can be broadly deployed across the entire software lifecycle: in development, QA, and staging, and in production with live customers.
In this talk, we will walk through the evolution of record/replay systems, look at what's now currently available, and examine the types of problems that are being solved today with this technology. We will also explore how record/replay technology is dramatically changing the way that software problems are solved, changes are deployed, and data centers are managed.
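As a toy illustration of the record/replay idea (a minimal sketch of the general technique, not Replay Solutions' actual technology): wrap a nondeterministic call so that its results are captured on the first run and replayed verbatim afterwards, making a failing run's conditions reproducible.

```python
import random

class Recorder:
    """Capture return values of a nondeterministic function on the first run,
    then replay the exact same sequence on subsequent runs."""

    def __init__(self, fn):
        self.fn = fn
        self.log = []         # recorded return values
        self.replaying = False
        self.cursor = 0

    def __call__(self, *args):
        if self.replaying:
            value = self.log[self.cursor]   # replay verbatim from the log
            self.cursor += 1
        else:
            value = self.fn(*args)          # live call; record the result
            self.log.append(value)
        return value

    def start_replay(self):
        self.replaying = True
        self.cursor = 0

roll = Recorder(lambda: random.randint(1, 6))
first_run = [roll() for _ in range(5)]   # live, nondeterministic
roll.start_replay()
second_run = [roll() for _ in range(5)]  # deterministic replay
print(first_run == second_run)           # True: identical conditions reproduced
```

Real record/replay systems intercept far more than one function (I/O, time, thread scheduling), but the principle is the same: replace every source of nondeterminism with a logged value so the original execution can be replayed at will.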
May 12, 2009
Hypothesis-Based Testing
Speaker: T Ashok (Founder/CEO, STAG Software, Bangalore, India)
The field of software testing is littered with jargon, process models, and tools. Although there's been significant progress in this field in recent years, we find it difficult to provide a logical means to get closer to perfection. A typical approach to testing based on the activity-based model consists of strategizing, planning, designing, automating, executing, and managing. Over the years, we've moved from completing these activities in one go to an agile version in which these activities are done in short increments. Yet the notion of "guarantee" seems elusive.
In this talk, the intent is to examine a different approach that can guarantee the quality of software. "Guarantee" here implies that the deployed software will not cause business loss. It's generally understood that testing is a process of uncovering defects that's accomplished via a good mix of techniques, tools and people skills. To make guarantees, it's imperative that the approach to evaluation be sharply goal-focused. Goal-focused evaluation means that we should have clarity as to what potential defects we need to go after. Once the potential defects are discerned by employing a scientific approach, it's possible to arrive at an effective validation strategy, a complete set of test cases, better measures of cleanliness (quality), and appropriate tooling.
Hypothesis-based testing is built on the core theme of hypothesizing potential defects and then scientifically constructing a test strategy, test cases, measures, and tooling. Hypothesis-based testing is powered by STEM 2.0 (STAG Test Engineering Method), STAG's defect detection technology, and has been adopted by various customers over the last 8 years. The business benefits derived by applying STEM are a reduction in development/test effort, lower software support costs, and accelerated development.
April 14, 2009
Multi-Client Testing Using STAF
Speaker: Nixon Augustin (Software Engineer, Brocade Communications)
The Software Testing Automation Framework (STAF) is an open source, multi-platform, multi-language framework designed around the idea of reusable components or services (such as process invocation, resource management, logging, and monitoring). STAF removes the tedium of building an automation infrastructure, thus enabling one to instead focus on building the automation solution. This presentation will discuss test automation challenges faced at Brocade's Files business unit and how using STAF helped Files to achieve its automation goals.
March 10, 2009
Automated Web Page Testing with Selenium IDE: A Tutorial
(presentation slides -- PDF)
Speaker: Mary Ann May-Pumphrey (Software QA Engineer and De Anza College Instructor)
Increasing global competition and a poor economy worldwide are forcing companies to look even harder at open-source solutions. Selenium is an increasingly popular choice for the automation of web page testing. This tutorial is a 75-minute condensation of Mary Ann's 30-hour class on Selenium IDE, which she teaches at De Anza College, Santa Clara Adult Education's High Tech Academy, and the Portnov QA School. Don't expect a "Selenium Big Picture" talk or a "Selenium Rah-Rah" talk! There are a number of Selenium projects currently under way. However, Mary Ann firmly believes "one must walk before one can run", which in Selenium terms means that one must learn the IDE well before moving on to RC (Remote Control).
February 10, 2009
Quality Assurance in the World of Agile Software Development
(presentation slides -- PDF)
Speaker: Russell Pannone (Agile Product Development Practitioner and Coach)
- What's Agile all about? Walk away with a solid understanding of the 6 key elements or building blocks to being agile.
- There's no QA in Agile! Find out if this is fact or fiction.
- How does Deming's PDCA Cycle fit into the world of Agile Product Development? Walk away with some tips to help transform your organization's point-of-view.
Available evidence shows that being agile has its rewards, but there are challenges your organization and teams will face when adopting agile product development. This presentation will bust the myth that there's no place for Quality Assurance in being agile. You'll learn how quality improvement is built into being agile as a result of iterative and incremental product development, daily stand-ups, sprint reviews, and retrospectives. This presentation will lay the foundation or building blocks that will enable you to gain a common understanding of what it means to be agile and to apply creative agile thinking to your systems / software development projects.
January 13, 2009
No SSQA meeting this month due to a room scheduling conflict. See you on February 10 for "QA in the World of Agile Software Development".
December 9, 2008
No SSQA meeting this month. Happy Holidays. See you in 2009!
November 18, 2008
Test-Driven Development Best Practices
(presentation slides -- PDF)
Agile software development methods are gaining popularity. The presentation will focus on one of these methods, Test-Driven Development (TDD). The talk will provide an introduction to TDD and cover the latest terminology, including: test-code-refactor, test doubles, stubs, fakes, mocks, dependency injection, etc. The talk will also cover some of the test frameworks employed in TDD, and discuss some TDD best practices and lessons learned.
Speaker: Satya Dodda (Director of Software Quality Engineering, Sun Microsystems)
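The test-double terminology above can be made concrete with a short sketch. This is not from the talk; it uses Python's standard-library unittest.mock, and the currency-conversion names are hypothetical, introduced only for illustration.

```python
from unittest.mock import Mock

def price_in_eur(amount_usd, rate_client):
    # Dependency injection: the collaborator is passed in, so a test
    # can substitute a test double for the real rate service.
    return round(amount_usd * rate_client.usd_to_eur(), 2)

# A stub returns canned data; here a Mock is configured to act as one.
stub = Mock()
stub.usd_to_eur.return_value = 0.5
assert price_in_eur(10, stub) == 5.0

# The same object doubles as a mock: we verify the interaction itself.
stub.usd_to_eur.assert_called_once_with()
```

The same pattern generalizes to fakes (working but simplified implementations) by passing in any object with a `usd_to_eur` method.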
October 14, 2008
Success with Test Automation
(presentation slides -- PDF)
John Green will present lessons learned and best practices from many years of developing test automation projects. Sample coding standards, processes, tools, and things to avoid will also be presented.
Speaker: John Green (Sr. Staff Engineer, VMware)
September 9, 2008
Beyond Testing - Achieving Software Excellence
Although testing is an essential part of achieving superior software quality, it's not sufficient. Additional ways must be found to improve software quality. Even before the introduction of CMM, it was well recognized that the development process can have a significant impact on software quality. This means that the testing team needs to understand the development process in use and should seek to influence the process. This talk will cover several simple suggestions for the development process that have proven to have significant beneficial impact on software quality.
Speaker: Yashwant Shitoot, CSQE, PMP
August 12, 2008
Enterprise Software Testing: How To Ensure Enterprise Software Is Highly Available
Today's enterprises must be available 24x7 to handle customer and partner requests. This places hard requirements on these systems to be highly available with minimal downtime, which in turn mandates clustered systems with hardware and software redundancy. Ensuring high availability presents complex testing challenges. Our speaker will provide an overview of highly available systems and the definition of 5-9's availability, and will cover high-availability testing methodologies, testing tools, and testing techniques, such as load balancing and failure injection.
Speaker: Sriram Lakkaraju (QA Manager, Sun Microsystems)
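As a back-of-the-envelope illustration of the availability figure mentioned above (not part of the talk), the downtime budget implied by "five nines" can be computed directly:

```python
# "5-9's" availability means 99.999% uptime, which leaves roughly
# 5.26 minutes of allowed downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def allowed_downtime_minutes(nines):
    # e.g. nines=5 -> availability 0.99999
    availability = 1 - 10 ** (-nines)
    return MINUTES_PER_YEAR * (1 - availability)

print(round(allowed_downtime_minutes(5), 2))  # about 5.26 minutes/year
```

This is why 5-9's systems cannot rely on manual failover: the entire yearly budget is smaller than a typical reboot.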
July 8, 2008
Test Lab Virtualization
As more enterprises and independent software vendors seek additional ways to leverage virtualization technology, Virtual Lab Automation (VLA) has emerged as an innovative solution for streamlining software development and automating the entire development and test environment setup, while utilizing existing server virtualization infrastructure. In addition, VLA improves resource utilization and efficiency while pushing products to market faster. This presentation will review the virtual test and development infrastructure and provide best practice recommendations for how VLA can add significant value to developers, testers, and IT operations staff and help drive business growth and employee productivity.
Speaker: Jim Singh (Director of Technology, VMLogix)
June 10, 2008
Total Automation!
Sachin Bansal of Adobe Systems will discuss automation strategies that can be followed (regardless of product or language) to achieve the TOTAL automation of quality engineering tasks in a fast-paced software life cycle. Sachin will demonstrate different automation subsystems and how they work together, execute tests, collect data, archive data, and present data on demand for analysis. The architecture of user-friendly regression, performance, and reliability test automation frameworks for system testing will be discussed. Sachin will also outline challenges and lessons learned from an ongoing journey in Total Automation. With the help of distributed automation, his teams have been able to reduce testing time while increasing test coverage. Sachin will present actionable suggestions for solving these critical problems, and provide a road map for successful global testing and test automation.
Speaker: Sachin Bansal (Sr. Quality Engineering Manager, Adobe Systems)
May 13, 2008
Enterprise 2.0 is here - Upgrade your Test Department!
Enterprise 2.0 has been defined as: flattening an organization, making it agile and flexible; harnessing the distributed and global aspects of its structure, making it simple and transparent; and utilizing on-demand and emerging information systems, shortening time-to-market cycles. Does this describe your test department? Samir Shah takes you through what all of this means to your test department and what you could be doing to upgrade it to Enterprise 2.0. In this day and age of global outsourcing, new technologies and systems to test, newer test methodologies, SOAs and integrations, distributed computing, and mashups, very little attention has been paid to bringing the test department into this new world and equipping it with the right toolsets, leaving frustrated managers with archaic, monolithic toolsets that are driven by projects and events.
Speaker: Samir Shah (Founder/CEO, Zephyr)
April 8, 2008
The SQA Approach on the Mozilla Project - How Firefox gets Tested
(presentation slides -- PDF)
QA is a challenge in any organization, but open source development adds extra dimensions to that task. Come hear Mozilla's Director of QA speak about testing in the creative world of open source software and how the Mozilla Project combines the effort of 22,017 test engineers and community volunteers to bring together specific SQA strategies, tools, and infrastructure. Our speaker will cover Mozilla's approach to developing and executing both manual and automated tests, focusing on the testing of Firefox, Mozilla's award-winning web browser.
Speaker: Tim Riley (Director of Quality Assurance, Mozilla Corporation)
March 11, 2008
Testing a Cool Internet Technology called "Ad Serving Systems" (think Google and Yahoo!)
This presentation will address testing one of the latest internet technologies, "Ad Serving Systems". Using a hypothetical Ad Serving System as an example, the speaker will describe the goals, main functionality, key modules, and complexity of a contemporary Ad Serving System. The speaker will describe the QA challenges faced and discuss the solutions (test tools and test frameworks) employed to address them.
Speaker: Madhava Avvari (QA Manager, Ad Serving Systems, Yahoo!)
February 12, 2008
Clichés, Metrics, and Methods: A Discussion of the Quality System and its Role in Contemporary Software Development
With several well-known software development clichés as a starting point, this presentation looks at the primary forces shaping a software development project and how these forces are typically balanced in a commercial software development project. We'll discuss the implications for quality goals, systems, and methods, including the need for lightweight and flexible processes combined with efficient, effective, and quantifiable defect containment. The presentation examines methods and techniques to enhance efficiency and proposes simple metrics to monitor in-process and overall defect containment effectiveness. A section of the presentation (MICRO Methodology) looks at procedures and tools generally needed for any software development project, independent of development methodology, and with a potentially huge impact on team performance. The presentation takes a broad view of software development in general and quality assurance in particular. Rather than attempting to provide complete and final answers to specific problems, the goal of the talk is to spark discussion, and maybe make us stop for a moment and think about how we do what we're so passionately doing. If time allows, we'll consider some slightly more philosophical aspects of development methodology and take a quick look at how Isaac Newton, Albert Einstein, Niels Bohr, and Brian Greene (superstring theorist) might have gone about developing software.
Speaker: Peter Jensen (Software Architect, Sun Microsystems)
January 8, 2008
Exploring an Expanded Model for Software Under Test
Many testers think of the test exercise using the simplest model for the SUT (Input/Process/Output). More complex models facilitate the design of better tests, help us understand the nature of what we're testing, and help us interpret the observed results. Doug has developed an expanded model, which represents the influences on SUT behavior and the domains for possible outcomes. The talk will be a presentation and interactive discussion of the model and some of its implications.
Speaker: Doug Hoffman (Consultant, Software Quality Methods LLC)
December 11, 2007
No SSQA meeting this month. Happy Holidays. See you in 2008!
November 13, 2007
Security in the Software Development Life Cycle
The best way to incorporate better security in any software development life cycle is to have a well-defined security process in place. Yet the reality in the current marketplace is that there is a heavy emphasis on security tools while the principle of incorporating security processes is ignored. The bottom line should be "Process comes first, then tools." Better testing processes like parameter validation (using the cardinal principle that "all input from users or external systems is evil until proved otherwise") and diligent code reviews (to catch logic bombs and poor coding practices that lead to vulnerabilities) will provide more bang for the buck than spending money on security audits of production-ready systems. The fundamentals of putting better processes in place in the SDLC will be discussed in detail. Using existing open source tools to perform security analyses on the code base will be demonstrated if time permits.
Speaker: Murali Nandigama (Senior Development Manager, Oracle Corporation)
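The parameter-validation principle above ("all input is evil until proved otherwise") amounts to validating against a strict allow-list rather than trying to enumerate bad values. A minimal sketch, not from the talk; the username rules here are an assumption for illustration:

```python
import re

# Allow-list: a letter followed by 2-15 letters, digits, or underscores.
# Anything else is rejected, rather than trying to block known-bad input.
USERNAME_RE = re.compile(r"^[A-Za-z][A-Za-z0-9_]{2,15}$")

def validate_username(raw):
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError("invalid username")
    return raw

assert validate_username("alice_01") == "alice_01"
for evil in ("a", "x" * 40, "bob; DROP TABLE users"):
    try:
        validate_username(evil)
        raise AssertionError("should have been rejected")
    except ValueError:
        pass  # rejected, as expected
```

The allow-list approach fails closed: novel attack strings are rejected by default instead of slipping past an incomplete block-list.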
October 16, 2007
Life is not static, so why are your test cases?
Protocols are curious beasts. They're the ultimate interface of any hardware/software product to the external world, be it file formats, APIs, communication protocols, RPC, or the command line. They're all intimately connected through re-use of constructs and patterns of vulnerabilities. Testing protocols is an effective way to both unit test and system test a hardware/software deployment. More importantly, by analyzing the attack surface exposed by these protocols, one can home in on the bugs that matter for expedited remediation. In the connected world, there's not much difference between a bug and a vulnerability. This talk will discuss these constructs and vulnerability patterns, and how an IP-enabled voltage regulator has much in common with web services.
Speaker: Kowsik Guruswamy (Co-founder and CTO, Mu Security)
September 11, 2007
A Graphical Display of Testing Status for Complex Configurations
Representing the status of software under test is complex and difficult, compounded when there are many interacting subsystems and combinations that must be tracked. This paper describes a method developed for a one-page representation of the test space for a large and complex set of product components. The latest project this was applied to had 10 interdependent variables and over 250 components. Once the components are identified and grouped, the spreadsheet can be used to show configurations to be tested, record test outcomes, and represent the overall state of testing coverage and outcomes. The paper uses a sanitized example modified from an actual test configuration.
Speaker: Doug Hoffman (Software QA Program Manager, Hewlett-Packard)
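A toy version of such a one-page configuration grid (not Doug Hoffman's actual spreadsheet; the component and configuration names are made up) might look like:

```python
# Rows are components, columns are configurations, and each cell holds
# the latest outcome, so the grid doubles as a coverage summary.
components = ["server", "client", "installer"]
configs = ["Linux/x86", "Windows", "Solaris"]

grid = {c: {cfg: "untested" for cfg in configs} for c in components}
grid["server"]["Linux/x86"] = "pass"
grid["client"]["Windows"] = "fail"

# Summarize overall coverage directly from the grid.
cells = [v for row in grid.values() for v in row.values()]
covered = sum(1 for v in cells if v != "untested")
print(f"{covered}/{len(cells)} cells exercised")  # 2/9 cells exercised
```

With 10 interdependent variables the real grid is far larger, but the idea is the same: one structure that records outcomes and reports coverage.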
August 14, 2007
Testing in the World of Open Source Software
QA is always a challenge in any organization, but open source development adds extra dimensions to that task. Come hear the Director of QA at Mozilla speak about testing in the creative world of open source software.
Speaker: Tim Riley (Director of Quality Assurance, Mozilla Corporation)
July 10, 2007
Closing the Loop On Quality - Integrating Customer Feedback
Quality teams are effective at testing against software requirements, but they often don't get relevant data to feed back into their test development process. This presentation focuses on lessons learned integrating customer feedback into the quality process.
Speaker: Gopal Jorapur (Staff Engineer, Sun Microsystems)
June 12, 2007
Next Generation Testing with TestNG
TestNG (http://testng.org) is an open source testing framework designed to cover all aspects of testing, from unit to functional and everything in between. TestNG has innovative features and is geared towards professional developers in search of a testing framework that covers all styles of Java code, from mobile to enterprise. This popular test harness offers a number of enhancements relative to JUnit. Cédric's talk will illustrate several TestNG features that enable advanced testing techniques, such as: multi-threaded testing; data-driven testing; using groups for better organization of tests; dependent testing; and much more.
Speaker: Cédric Beust (Senior Software Engineer, Google)
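For readers who don't use Java, the data-driven style mentioned above (TestNG's data-provider feature) has a rough analogue in Python's unittest subTest. The following is an illustrative sketch, not TestNG itself:

```python
import unittest

class SquareTests(unittest.TestCase):
    def test_squares(self):
        # The list of cases plays the role of TestNG's data provider:
        # one test method, many data rows, each reported separately.
        cases = [(2, 4), (3, 9), (-4, 16)]
        for x, expected in cases:
            with self.subTest(x=x):
                self.assertEqual(x * x, expected)

suite = unittest.TestLoader().loadTestsFromTestCase(SquareTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```

As with a data provider, a failing row is reported with its parameters (`x=...`) without stopping the remaining rows.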
May 8, 2007
Developing and Using a Defect Removal Model to Predict Customer Experiences on Software Products
Reliability is a key requirement for Hewlett-Packard's NonStop Server Systems; the software that goes into these systems has to be of the highest quality. A prediction of software quality can help better control development practices to achieve desired quality goals using available resources. Several software defect models exist in the industry. Growth models use statistical distributions to predict customer experience; other models predict customer experience based on past project, product, and development-organization characteristics. The speakers will share their experience of developing and using a phase-based containment model, in which the effectiveness of defect removal activities is used to predict customer experience. Larry and Sujoy will describe how they implemented the model in their organization to assess product quality in each phase of a project's life cycle, and how quality information is aggregated to make a release-level prediction of what customers will experience. They will also share key benefits and lessons learned as a result of the defect removal initiative.
Speakers: Larry Steinhaus and Sujoy Ghosh (Program Managers, NonStop Division, Hewlett-Packard)
April 10, 2007
The Software Project as a Journey
There have been many comparisons between software projects and other kinds of efforts, such as building a house, a bridge, or some other engineering or architectural feat. Another useful analogy is viewing a software project as a kind of journey. You start out. Things happen along the way. You arrive at a destination. So what makes any journey a success? There are many possible criteria. One famous example was "to get to the moon and return safely by the end of the decade." For software projects, Brian Lawrence suggests that a worthy criterion might be "to fulfill the objectives of the sponsor." Another could be "to make tons of money." In this presentation, Brian will examine several journeys -- some where people traveled from place to place -- and some software journeys, which started with an idea and arrived at a destination. All these journeys, physical and software, either succeeded or failed. Why is it that some journeys succeed, while others fail? What are the critical success factors? Brian will assert that, for both software and physical journeys, some of the factors are exactly the same.
Speaker: Brian Lawrence (Principal, Coyote Valley Software)
March 13, 2007
Requirements Management, an Integral part of Quality Release
Ambiguous, incomplete, and changing requirements are responsible for many software project failures. Requirements management is therefore a key component of project success and software quality. Requirement clarity, definition (of attributes and constraints), storage, and change management are elements of requirements management that contribute to a quality product and improved customer satisfaction. Managing your requirements successfully means having complete visibility and accountability, so your organization understands where and what the requirements are across the software life cycle. Anita Wotiz will present an overview and key topics of requirements management and its role in the software development life cycle, and will describe how each element contributes to software quality and helps to meet project delivery dates.
Speaker: Anita Wotiz (Program Coordinator, Software Engineering, UCSC Extension)
February 13, 2007
How to Design Regression Test Automation Frameworks for System Testing
(presentation slides -- PDF)
Sachin Bansal of Adobe will discuss designing modular, customizable, and user-friendly regression test automation frameworks for the system testing of servers. He will explain how to identify, design, implement, and execute complex automation frameworks involving different technologies. Different frameworks will be presented to elaborate on the challenges and lessons learned. Sachin will also discuss the challenges faced during integration of the bug tracking system, test case management system, performance testing system, and automation system.
Speaker: Sachin Bansal (Senior Quality Engineering Manager, Adobe Systems)
January 9, 2007
Keeping Score - How to Know When You're Done
(presentation slides -- PDF)
Knowing when a product is ready to ship is one of the hardest questions companies have to answer, probably second only to its corollary, "How long will it take to build?" A simple answer, which some might think facetious, is "When it passes its tests." This talk presents a business case for doing just that -- scoring product completion via the state of its testing. Unlike the common wisdom of looking at error rates or bug report frequencies, this approach predicts the total number of tests required to exercise the product completely and keeps score of the product's readiness through a few simple measures. It is implemented via a novel approach to regression testing. As a side effect, it gives the quality team's efforts significant visibility in the product development process.
Speaker: David Roland (Senior Computer Scientist, Computer Sciences Corp., NASA Ames Research Center)
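The scoring idea can be illustrated with trivial arithmetic (a sketch of the general idea, not David Roland's actual measures): predict the total number of tests needed to exercise the product, then track the fraction that pass.

```python
def readiness_score(passed, total_predicted):
    # Tests not yet written or not yet passing count against the
    # score, unlike bug-count metrics, which say nothing about the
    # parts of the product that were never tested at all.
    return passed / total_predicted

# 450 of a predicted 600 tests pass: the product is 75% "done".
assert readiness_score(450, 600) == 0.75
```

The hard part, of course, is the prediction in the denominator; the score is only as honest as the estimate of total tests required.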
December 12, 2006
Book Signing and Holiday Festivities with Local Authors
This month, we will celebrate the efforts and accomplishments of community members who have contributed to the body of knowledge in software quality and testing. Four authors will be present, and each will take about 10 minutes to present the primary theme of his/her book. Afterwards, we'll enjoy holiday food and beverages, and the authors will be happy to sign books. Copies are available for purchase, but it's better if you buy yours beforehand. All books are available for purchase from http://www.amazon.com.
Speakers: Tim Stein, Jason Reid, Alka Jarvis, James Cunningham
- R. Timothy Stein. He founded Business Performance Associates in 1994, which has consulted with more than 80 companies from 18 industry segments, particularly the medical device, pharmaceutical, biologics, and diagnostic industries. Stein wrote "The Computer System Risk Management and Validation Life Cycle". This book is the first technical manual to integrate computer system validation, risk management, and system implementation into a single, easy-to-use process. Easily understood by system users and IT professionals, this book explains basic concepts and translates them into how-to deliverables to simplify the tough decisions associated with a wide range of systems and their potential risks of failure.
- Jason M. Reid. He is a test engineer at Sun Microsystems working in the Solaris System Test group. He has also been an SQA engineer in the Developer Tools group. Reid wrote "Secure Shell in the Enterprise". This book covers the technical side of Secure Shell, but, as its title states, it also covers the methodology of using Secure Shell in a large environment. It describes what Secure Shell does and goes into helpful detail on the logistics of using SSH, along with directly related topics such as authentication, public and private keys, and numerous other aspects that help you understand how SSH can be used in an enterprise setting.
- Alka Jarvis. She is Manager of Software Quality at Cisco Systems and a certified quality lead auditor (ISO 9000). Jarvis wrote "Inroads to Software Quality" and "Dare to Be Excellent". The first book, "Inroads to Software Quality", is used as a text book in the MBA program of Santa Clara University, UC Berkeley-Extension and UCSC-Extension. The second book, "Dare to Be Excellent", describes the successful software practices of ten companies including Intel, Texas Instruments, Cisco Systems and others.
- James A. Cunningham. He has a 25-year history in the corporate semiconductor industry, working for such powerhouse companies as TI, National Semiconductor, and AMD, often at the vice president level. He holds 46 patents and has published 18 technical papers, a 200-page book on CMOS technology, and a book in Japan in the 1980s concerning the growing strength of the Japanese semiconductor industry. Cunningham has written "The Hollowing of America". This book analyzes the economic ramifications of America's growing loss of domestic manufacturing and the associated massive trade imbalance, and tells how to survive, if not prosper, in the face of a massive decline in the dollar.
November 14, 2006
Test Automation Beyond Regression
Most testers picture GUI-based scripted regression testing when they think of test automation. This is a very limited view of the potentially vast possibilities open to us when automating tests. When we think of test automation, we should first think about doing things that we can't do manually. This talk is about those limitations and how other kinds of test automation may be much more valuable.
Speaker: Doug Hoffman (Program Manager, Hewlett-Packard)
October 10, 2006
The 5 Percent Rules of Test Automation
Successful automation is every test manager's dream. It can shorten time-to-market cycles and improve quality assurance. Testers would love it too, since it can alleviate much of the boring work of executing tests and free up time to design better tests and to better follow up on test outcomes. In practice, however, results often fall short: most test automation tools either end up on the shelf collecting dust or are at best used for a small part of the testing. When is test automation successful? In this talk, Hans will argue that stable automation is only achieved if a large percentage of test cases can be executed automatically and the automation does not take too much time away from the testers. To provoke the issue, he will challenge us with the following two rules: (1) no more than 5% of all tests should be executed manually, and (2) no more than 5% of all efforts around testing should involve automating the tests. Apart from presenting the rules and why he feels they are important, Hans will mostly talk about how to meet them, using actual projects to illustrate a number of key principles that drive automation success: (1) test design, (2) automation architecture, and (3) organization. He will introduce his Action Based Testing framework as an example of a methodology with which the 5% standards can be achieved.
Speaker: Hans Buwalda (Chief Technology Officer, LogiGear Corporation)
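One way to operationalize the two 5% rules as a quick project check (our formulation, not Hans Buwalda's; the thresholds are the only part taken from the talk):

```python
def meets_five_percent_rules(manual_tests, total_tests,
                             automation_hours, total_test_hours):
    # Rule 1: no more than 5% of all tests are executed manually.
    # Rule 2: no more than 5% of all test effort goes into automating.
    return (manual_tests <= 0.05 * total_tests
            and automation_hours <= 0.05 * total_test_hours)

# 40 manual tests out of 1000, 30 automation hours out of 800: passes.
assert meets_five_percent_rules(40, 1000, 30, 800) is True
# 100 manual tests out of 1000 (10%) violates rule 1.
assert meets_five_percent_rules(100, 1000, 30, 800) is False
```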
September 12, 2006
Who is Responsible for Quality?
Since the internet boom, small and large companies alike have been driving their teams to work faster. Since the tech bust, small and large companies have been asking their teams to do their work with fewer people. The squeeze is on, teams miss their goals, software is shipped before its time and, in the end, the customers suffer. Whose job is it to make sure we produce quality software? How do we build quality into the planning and development process? How do we avoid creating adversarial situations between those trying to meet company goals and those trying to ensure quality? Come and hear Mark talk about his experiences and answers to these questions!
Speaker: Mark Himelstein (President, Heavenstone Inc.)
August 8, 2006
Quality-Driven Build Scripts for Java Applications
(presentation slides -- PDF)
Agile build scripts not only compile code but also provide important information about the product. These documents and metrics vary in their objectives, but they facilitate the attainment of a common goal: building quality software. Unlike physical goods, obtaining information about applications is relatively inexpensive and disproportionately beneficial to engineering efforts. We'll explore open-source, off-the-shelf utilities for creating quality-driven build scripts for Java applications.
Speaker: Duy Bao Vo (Graduate Student, San Jose State University)
July 11, 2006
Metrics, Benchmarking and Predictive Modeling
How does a company measure and deliver on customer success? In this interactive session, David will share how Cisco links Customer Lifetime Value, Customer Loyalty, Customer Satisfaction, Product / Service Satisfaction, Product / Service Quality Experience, and Product / Service Design. He will be eliciting feedback on how to set customer experience targets based on customer explicit and derived needs, competitive pressures, industry best practices, and company process capability. We will discuss the role of and approaches to benchmarking in helping set these targets, as well as the precision and key elements of predictive modeling needed to assure that the company delivers on these measurable goals. Please come to this engaging session to learn, share, and discuss these key elements of measuring and delivering on customer success.
Speaker: David Hsiao (Director, Metrics Strategy and Benchmarking COE, Cisco Systems)
June 13, 2006
Fighting the BUG WAR with Inspections and Reviews: A Success Story
Understanding the quality of your systems is best aided by proper defect classification and analysis, so that the right practices are followed, the right policies are chosen, and the tools are used in the right way. Through careful logging and analysis of inspection and review results, learn how Cadence applies the lessons learned to reduce its engineering costs.
Speaker: Duvan Luong, PhD (Technical Director for Enterprise Quality, Cadence Design Systems)
Topics covered are:
- Strategy for fighting the bug war
- Bug profiles
- Leveraging the best bug fighting practices
- Organizational preparation
- Analysis of the results
May 9, 2006
Using Data-Driven Analysis to Increase Customer Satisfaction
(presentation slides -- PDF)
Learn how the Cisco Systems Technical Services Group uses a combination of statistical tools and integrated data analysis to identify key drivers of customer satisfaction and loyalty.
Speaker: Lisa K. Arnold (Customer Satisfaction Analyst, Cisco Systems)
Topics that will be covered are:
* Apply various statistical methods, such as:
o Qualitative data integration using Linkage Analysis
o Multiple regression analysis modeling
o Other processes borrowed from the Six Sigma toolbox
* Present data that gets attention
* Translate feedback into improvement initiatives
* Monitor progress and measure success
* Apply what we learned
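As a loose illustration of the multiple-regression step listed above, the sketch below fits satisfaction survey scores against a few candidate drivers by ordinary least squares. The data, the driver names, and the 1-5 scoring are entirely hypothetical stand-ins, not Cisco's actual dataset or model.

```python
import numpy as np

# Hypothetical survey responses; columns are driver scores on a 1-5 scale:
# [product quality, support responsiveness, ease of doing business]
X = np.array([
    [4, 5, 3],
    [3, 2, 4],
    [5, 4, 4],
    [2, 3, 2],
    [4, 4, 5],
    [3, 3, 3],
], dtype=float)
# Overall satisfaction score reported by each respondent.
y = np.array([4.5, 2.8, 4.6, 2.2, 4.4, 3.1])

# Fit y ~ b0 + b1*x1 + b2*x2 + b3*x3 by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The relative size of each coefficient suggests which driver moves
# overall satisfaction the most (on this toy data).
drivers = ["product quality", "support responsiveness", "ease of business"]
for name, b in zip(drivers, coef[1:]):
    print(f"{name}: {b:+.2f}")
```

In practice the talk's approach would add qualitative linkage analysis and significance checks on top of a fit like this; the fragment only shows the mechanical regression step.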
April 11, 2006
Software Testing as a Career – Still Viable?
Much has been made of how offshoring has robbed Silicon Valley of its high-tech jobs, with new opportunities springing up in such far-flung places as India, China, and Eastern Europe. But what's the real impact today? While the SSQA membership has seen profound improvements in employment over the past year, come hear the thoughts of Mikhail Portnov of Portnov Computer School in Mountain View, CA. He experienced firsthand the explosive growth of the '90s, when job needs went from a handful of openings to hundreds of open positions and then back down in the early 2000s. Yet Mikhail remains positive about the outlook for software QA and test professionals. As the job market revives, his students (many of whom are well-educated and legal immigrants) and their spouses are finding QA and testing positions more quickly again. Mikhail will cover the following topics:
Speaker: Mikhail Portnov (Founder, Portnov Computer School)
* Different venues for QA training today
* Different niches in the QA training market
* Specific ideas for short-term career transition for degreed students
* Realities of today's job market
* Thoughts on local QA resources versus offshore testing
March 14, 2006
Software Engineering: Facts or Fancy?
(presentation slides -- PDF)
We hear many things that we either are or should be doing to deliver the right software in a timely and cost-effective manner. Some of these approaches are very popular, some less so. But do we really know which of these approaches actually work? And how would we know if they did? In this presentation, Brian will offer some ideas and we'll examine some of the assumptions we make about software engineering. Which things that you currently believe in are indeed fact? And which are fancy? Let's learn together.
Speaker: Brian Lawrence (Principal, Coyote Valley Software)
February 14, 2006
QA Road Warriors
QA has matured, and yet in many parts of the world there is a deep need for qualified QA professionals, individuals who know how it is done right. While Silicon Valley continues to shake itself awake after its long slumber, these two professionals couldn't wait and sought opportunities and professional excitement outside the valley. Doug Hoffman joined SDT and brought his assessment and training expertise to companies in India, China, Canada, and France. Claudia Dencker brought her test leadership and management skills to an electrical utility company in Edmonton, Canada. Share in their stories of the excitement, trials, and tribulations of taking QA expertise on the road.
Speakers: Claudia Dencker (President, Software SETT Corporation) and Doug Hoffman (Software Quality Methods, LLC)
January 10, 2006
The T in Quality
This talk will focus on the T in Quality. We'll start with the definition of QUALITY and contrast it with Quality Assurance and Quality Control. We'll also talk about the institutionalization of quality and what types of support systems need to be in place to make this happen. Finally, we'll review some case studies in implementing training to support software process training.
Speaker: Lew Jamison (CEO / Learning Strategist, Performance Improvement Circle)
December 13, 2005
Quality Training: What's been your experience?
The speaker will facilitate a discussion of the audience's experiences with quality-related training in the corporate environment. How useful has it been? How does one measure its effectiveness?
Speaker: Lew Jamison (CEO / Learning Strategist, Performance Improvement Circle)
November 8, 2005
Estimating Software Size
Accurately projecting the size of a proposed software system remains the weakest link in the software cost estimating chain. Deriving an appropriate size estimate is neither straightforward nor trivial. Due to the lack of definitive information during the concept and design phases of software system development, size estimates made in those phases are characterized by uncertainty, generally resulting in estimates of very low credibility or validity. Even as systems mature in their final stages (with requirements stabilized, all data inputs, outputs, and interfaces identified, and all processing functions clearly defined), the process of sizing software is still subject to a wide margin of uncertainty. This presentation addresses the software sizing problem and discusses the Software Sizing Model (SSM) developed by Dr. Bozoki and in use worldwide. Also discussed will be how this model can help organizations address CMMI® model requirements regarding estimation and historical data.
Speaker: Dr. George Bozoki (Founder, Target Software)
October 11, 2005
Logical Entity/Relationship Modeling: The Definition of Truth for Data
Logical Entity/Relationship (E/R) models, also referred to as "conceptual" or "semantic" models, define the information requirements of the enterprise, independent of the resulting implementation. A well-defined E/R model is the key to successful development of data-oriented applications. Although most frequently associated with relational databases, the logical E/R model is equally applicable to object-oriented and XML implementations. This presentation will provide an overview of the fundamentals of E/R modeling as the definition of the information requirements of the enterprise. It will focus on the underlying concepts and notations, with a strong emphasis on the semantic content of the E/R model.
Speaker: Jeff Jacobs (Covad Communications, Jeffrey Jacobs and Associates)
September 13, 2005
Automation Techniques for Enterprise Application Testing
Enterprise applications comprise dozens of technologies and hundreds of classes, often developed and tested by scores of dispersed teams using disparate build and test frameworks. Integrating components of applications developed by different teams often results in test bases with numerous build frameworks that are inefficient and a nightmare to enhance. Proven testing strategies are presented and lessons are drawn from the Sun Java System Application Server SQE Team.
Speaker: Aditya Dada (Sun Java System Application Server SQE Team, Sun Microsystems)
Through this session you will be able to:
- Identify frameworks that are inflexible.
- Seamlessly introduce automation framework in test bases with multiple frameworks.
- Create a framework that allows easy changes to supported configurations.
- Create a framework that allows easy management of administration changes.
- Create a framework that allows greater coverage on multiple configurations.
August 9, 2005
A Process-Driven Approach for Effective Application Service Quality for IT Organizations
A characteristic of most IT organizations is that controls and procedures for new software development projects are very rigorous, while operations and maintenance controls and procedures are much less structured and accountable. However, applications are constantly changing, driven by ever-changing business requirements and problem reports from end users. A lack of solid process can drive maintenance costs sky high and create a very chaotic environment. Keith Mangold and Q Analysts have developed a model leveraging best-of-breed industry standards such as ITIL, CMMI, TQM, and COBIT to incrementally improve application service quality for large IT enterprises. The model prescribes standardized procedures that provide the vehicle for focused process improvements that increase efficiency and reduce cost. This presentation focuses on software change, but is applicable to business process, data content, and infrastructure changes.
Speaker: Keith Mangold, Q Analysts
July 12, 2005
Early Testing Without the 'Test and Test Again' Syndrome
June 14, 2005
Sabotaging QA: a Primer
May 10, 2005
The State of Spyware
Speaker: Robert Konigsberg (Founder, Network Evaluation)
Spyware has come of age as computer intrusions, infections, and hack attempts have increased dramatically over the past few years. Users, ranging from novices to IT and QA professionals, have many options open to them to better protect their systems in a game with no winners and no end. Robert Konigsberg will present information on spyware, where it is today, and the various forms it can take. In addition, he will cover:
- Visible identifiers of spyware
- In-depth exploration of spyware installations
- The various purposes for infection or intrusion
- Legal Aspects: What "License Agreements" imply and fail to make clear
- Origins for spyware (Where does spyware come from?)
- The role of anti-virus and anti-spyware utilities
- Coping strategies for the user
Robert is the founder of Network Evaluation and has been involved in various aspects of computer security since 1992. He has produced magazine articles, white papers, tutorials, guides, and presentations aimed at informing and educating users on various aspects of networking and network and computer security. Prior to starting Network Evaluation, he worked for companies such as 3Com, Computer Curriculum Corporation, and Pearson Education, as well as following the Silicon Valley tradition of trying his hand at a few startups. He has earned SANS GSEC certification and is an active member of the Center for Internet Security.
Solaris and Open Source - Current Status
Sun has announced plans to release the source code for the Solaris Operating System under an open source license, and to open the development process to external developers. This talk described these plans, the current status, and some of the challenges involved in moving a large commercial software project to an open development process.
Speakers: Andy Tucker and Keith Wesolowski
Andy Tucker is a Distinguished Engineer in the Operating Platforms Group in Sun Microsystems. He has been at Sun since 1994 working on a variety of projects related to the Solaris operating system, including scheduling, multiprocessor support, inter-process communication, clustering, resource management, and server virtualization. Most recently, he was the architect and technical lead for Solaris Containers, and is helping lead the effort to make Solaris available as open source. Andy received a Ph.D. in Computer Science from Stanford University in 1994.
Keith Wesolowski is an engineer in the OpenSolaris team within Sun Microsystems' Operating Platforms Group. He joined Sun and the OpenSolaris project in 2004 and has an extensive open source background, including SPARC and MIPS Linux ports and several smaller projects.
March 8, 2005
QA and Open Source - The Good, the Bad and the Ugly
Sleepycat Software makes the Berkeley DB family of products and makes them available under a dual-license model. That means we have both open source and proprietary licenses. Our technical team has extensive open source project experience, and the roots of the Berkeley DB product came from the University of California, Berkeley. This presentation explored the engineering and SQA challenges of managing an open source product in a distributed company.
Speaker: Dave Segleau
David Segleau, who has more than 22 years of IT industry experience, oversees Sleepycat Software's engineering, quality assurance, and support operations. Segleau joined Sleepycat from Visto, where he was director of Quality Assurance and led the QA partnership with Handspring for delivery of the original TreoMail product. Previously, Segleau headed up customer service for Asta Networks in Seattle, was senior director of Engineering Services at Versata, and was senior director of quality assurance and technical support at Informix and Illustra.
February 8, 2005
War Stories from the Ground Level
This February meeting was a working meeting of the SSQA membership. We started with our annual topic, War Stories at the Ground Level, and wrapped up with a brainstorming session on 2005 topics.
Speaker: Roundtable Discussion (SSQA Membership)
Speakers: Yana Mezher, Dave Weir, Dave Liebreich
As more software IT work goes offshore, some local IT professionals are evolving their jobs to address the critical need for global project managers. In January we heard from three project leads/managers who are actively managing global [offshore] teams. They shared some of their success factors, challenges, and concerns as they work within the new business model directed by executive management.
Yana Mezher - test lead, Software SETT Corporation. Over the past eight years, Ms. Mezher has focused on developing online QA courseware, supporting testing projects with mentoring, team training and hands-on project management. She is currently managing a team based in Bangalore, India.
Dave Weir - consultant with Calavista managing several offshore QA projects for start-ups in Silicon Valley. Over the past 18 years Mr. Weir has worked with companies such as Keynote Systems, XUMA, KPMG/Triton Container International, Pacific Bell and Tandem Computers. He is currently managing an offshore QA partner based in Pune, India.
Dave Liebreich - QA Manager at Yahoo. Over the past 20 years, Mr. Liebreich has been involved in test management, test engineering and system administration working on a wide range of technologies and products. He is currently managing teams based in Sunnyvale and Bangalore, India.
December 14, 2004
Outsourcing in Software Engineering
Speaker: Sean Nihalani
In the field of software engineering, outsourcing software projects requires developing, implementing, and managing methodologies that ensure the job gets done and produces results. This talk presents a high-level overview of outsourcing and the countries that are the big winners in this new business model. It will also discuss the overall impact and ramifications for the software and technical professions.
Sean Nihalani, DSc, is the Director of the Engineering and Technologies Department at UCSC Extension in Silicon Valley. Dr. Nihalani has designed, presented and managed engineering, IT, and management courses at many universities and corporations. His 18 years of experience includes design, development, troubleshooting of LANs, WANs, hardware, software, network security and databases, as well as managing various engineering, IT, and financial projects.
November 9, 2004
Part 11 - Electronic Records and Electronic Signatures: Review of the
Regulation and a Discussion of Issues
Speaker: Tim Stein, PhD
This presentation provided an overview of the Part 11 regulation. Major areas of noncompliance routinely found in software not specifically developed for Part 11 compliance were discussed. Issues that organizations face in complying with the regulation were outlined, and several compliance strategies were presented.
Tim Stein founded Business Performance Associates (BPA), a Cupertino-based consulting firm, in 1994. Tim has helped over 100 clients achieve Part 11 compliance, validate systems, implement business applications, or develop compliant quality systems. He has recently finished a manuscript for a book titled Computer System Risk Management and Validation Lifecycle; the book will be published by Paton Press and is scheduled for release next year. He is a frequent speaker on the topics of Part 11 and software validation.
October 12, 2004
Topic: Why Performance QA is Broken and How to Fix It.
Speaker: Damien Farnham
Abstract not available.
Damien Farnham is a Senior Manager for the Sun Microsystems Solaris Performance team in Dublin, Ireland.
September 14, 2004
Topic: SNMP: A primer
Speaker: William Estrada
This presentation covered:
- The basics of SNMP
- Example wireless sessions
- Example wired sessions
William R. Estrada II has over 25 years of experience as a system programmer (mainframe and PC), network admin, lab manager, system programming manager, and computer operator. His on-the-job experience spans a wide range of operating systems (MVS, VS1, OS/2, DOS, Windows, Linux, and FreeBSD). His specialties are problem solving, automation, scripting, networking and, last but certainly not least, SNMP.
August 10, 2004
Topic: Staying Relevant in a Competitive Market
Speaker: Peter Yarbrough
This presentation covered:
- Getting started in revitalizing your QA career
- Jumpstarting change in three areas
- The importance of soft skills
Peter Yarbrough is a QA professional with over seven years of experience working on many large-scale IT projects for Fortune 500 companies. He has acted as QA lead or manager as well as QA engineer, testing web applications and shrink-wrapped software products. Most recently, he moved into technical support, working with a small team acting as liaison between development and online support. As a QA professional who has had to adapt to a changing technical landscape, Peter brings a unique perspective on staying employed, engaged, and relevant in the QA profession. He studied engineering at Santa Clara University and holds a degree in Technical Communications from De Anza College.
July 13, 2004
Topic: Market Based Job Searching
Speaker: Merrin Donley
This presentation covered:
- Current labor market information for job seekers
- WIA services and other free or low-cost resources for job seekers
- Hiring trends and factors for job search success
Merrin Donley is a Career Management Specialist with over twelve years of experience working with diverse industries in Silicon Valley. Most recently, she has been working for the Silicon Valley Workforce Investment Board at the Campbell One-Stop, assisting recently laid-off workers in their job searches.
June 8, 2004
Topic: Test Variables Impacting Wireless Applications
Speaker: Gail Lowell
Here is a familiar test scenario. You are logging into an account. You enter the correct account number and password. The logon authentication fails. You try again, but this time you are careful: you are absolutely certain you have entered the correct account number and password. The logon fails again.
Did you check your headset? Headset! What does a headset have to do with a logon failure? In a wireless world, accessories can sometimes cause distortion or interfere with the sending of correct tones through your telephone or other wireless device. There are many other unique factors that can impact results when testing applications designed for a wireless environment.
This overview can help you enrich your test scripts and be prepared for the surprising results you sometimes get when testing wireless applications.
Gail Lowell is the Product Manager and acting QA Manager for the Unified Communications solutions business unit of InPhonic, Inc. InPhonic is a leading provider of communication software and services. Gail has several years experience managing the introduction and release of complex software applications in the US and internationally. She has industry experience in Unified Communications, Wireless Phones, Warehouse Logistics, Semiconductor Manufacturing, Human Resources, and Insurance software applications. Gail has been working with unified communications and wireless devices for the last four years.
May 11, 2004
Topic: Security Testing
Speakers: Rhonda Farrell and Jason Reid
In these uncertain times, security has taken on a new significance. Homeland security, white-collar crime, corporate espionage, and new legal regulations drive the increased need for security. Customers, users, and the government expect, and may contractually oblige you, to deliver a secure application.
What is a secure application? What makes something "secure"? Will security testing keep my company's name unsullied? Will security testing turn my staff into a group of evil hackers? How can I plan for, execute, and validate security testing efforts?
This discussion provides an overview of what security is and how you test it. The presentation gives examples of what to look for and why, covering security at both the conceptual and technical levels.
Jason Reid is a test engineer at Sun Microsystems working in the Software System Test group. He has also been an SQA engineer in the Developer Tools group. Prior to joining Sun, Jason worked at the Purdue University Computing Center as an UNIX system administrator while obtaining his BS in Computer Science.
April 13, 2004
Topic: Realistically Estimating Test Projects
Speaker: Ross Collard
Question: When will the system testing be completed??? (Asked by an eager pest -- your boss -- with a tone of great anxiety.)
Note #1: At the time he asks this question, you do not know (a) the final scope of the functionality, (b) when the developers will deliver the final system for testing, and (c) what test resources you will have available.
Note #2: The boss wants a definitive answer and a drop-dead commitment from you in two minutes anyway.
Answer: Take a wild guess and multiply by two.
Question: What do you do when the boss cuts your agreed-on test duration by 85%??
Answer: Remind him that you thought he really understood that quality is important.
Let's face it: developing realistic and credible estimates is a critical survival skill for test professionals and managers. The word "estimate" is actually short for the saying "Establishing Sloppy Time Intervals Makes for Agitated Test Engineers". This discussion overviews techniques that can help improve your estimating.
Ross Collard is president of Collard & Company, a consulting firm located in Manhattan, New York. His consulting assignments have included strategic planning for technology, managing large software development projects, improving software engineering practices, and software quality assurance.
Early in his career Ross was a hot-shot software engineer for Citibank in New York City. He first became interested in quality issues when he stayed up 48 hours straight trying to find a bug in his own code. During these same 48 hours the operational failure caused by his bug cost Citibank approximately 1,000 times Ross' annual salary. Fortunately for Citibank, this same loss only amounted to approximately 5 seconds worth of the bank's profits. Also fortunately for Citibank and the rest of the worldwide business community, Ross does not program much any more. But he sure does know how to test -- which is perhaps a more challenging skill than programming.
Ross Collard has conducted seminars on business and information technology topics for businesses, governments and universities, including George Washington, Harvard and New York Universities and U.C. Berkeley. He has lectured in the U.S.A., Europe, the Middle East, the Far East, South America and the South Pacific. He has a BE in Electrical Engineering from the University of Auckland, New Zealand (where he grew up), an MS in Computer Science from the California Institute of Technology and attended Stanford University's Graduate School of Business. He is writing a series of books on software testing and QA, at a gruesomely slow pace.
March 9, 2004
Topic: War Stories at the Ground Level
Speaker: Panel Discussion (SSQA Membership)
These short descriptions represent real events as related at this meeting:
- Incrementing build ID's at the decimal level so the builds wouldn't seem so many: .01, .02, .03, etc.
- Dividing test cases into smaller chunks so that the volume of work would seem higher on the metric reports
- Testing for the absence of a feature without knowing how to turn it on/off
- Testing based on a schedule-driven date - why bother and not just release
- Submit a ton of bugs into the bug base to help get more time for testing or to delay the release
- Failing of tests at the same point every day due to the sun coming through the blinds and overheating the computer
- Failing of hardware due to software tests
- Exploding toner which required the tester to be rushed to the hospital due to toner in the face, in the nostrils and eyes
- Conducting extensive rework to make up for organizational shortfalls before even starting the task of testing
- Publishing the results of the BEST load test run even though every run produced a separate set of results
- Releasing a product with the limits set at test levels, not the full production limits. Several examples were cited, but the most memorable one was Excel which, in a past release, was released with a 10x10 array, not the usual 65,000 x 65,000 array.
- Testing new hardware away from all the other equipment in the lab and away from a sprinkler head, so that if it catches fire or smokes, every piece of lab equipment doesn't get destroyed by water
- Not understanding why a specification is a key input to test scope
Bio not applicable
February 10, 2004
Topic: Roll Your Own .NET Automated Tests
Speaker: Elisabeth Hendrickson
Despite a variety of commercial graphical user interface (GUI) test tools on the market, programmers often find themselves resorting to manual testing of their GUIs. Adopting commercial GUI-based regression tools requires developers to learn a whole new development environment and language. Furthermore, these tools are often expensive and may be overkill for what developers need. Fortunately, there is an alternative for programmers who need to test a .NET GUI: reflection. By using reflection in the .NET framework, programmers can send events to user interface elements without a separate, specialized tool. In this talk, Elisabeth Hendrickson demonstrates how to use C# with nUnit to simulate user events to test .NET applications through the GUI.
This talk is a preview of the talk Elisabeth will be giving at Software Development Conference and Expo West 2004 on March 18 in Santa Clara, CA.
Elisabeth Hendrickson is an independent consultant specializing in software quality, management, and testing. An award winning author, Elisabeth has numerous published articles and is a frequent speaker at major software quality and software management conferences. She has worked with and for leading software companies since 1988. You can reach her at esh@qualitytree.com and read more about her ideas on quality and testing at www.qualitytree.com.
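The talk's technique is specific to C# and .NET reflection, but the core trick, looking up an event handler by name at runtime and invoking it without a specialized GUI tool, translates to other languages. Below is a minimal sketch in Python using getattr() in place of .NET reflection; LoginForm, its handler names, and fire() are hypothetical stand-ins for illustration, not part of any real framework.

```python
class LoginForm:
    """A toy stand-in for a GUI form with event handlers."""

    def __init__(self):
        self.user = ""
        self.status = ""

    def on_text_changed(self, value):
        # Simulates the user typing into the account-name field.
        self.user = value

    def on_click_submit(self):
        # Simulates the user clicking the submit button.
        self.status = "ok" if self.user else "error"


def fire(target, handler_name, *args):
    """Look up an event handler by name at runtime and invoke it,
    the way a reflection-based test driver would."""
    handler = getattr(target, handler_name)
    return handler(*args)


# A test can now drive the "GUI" purely by handler names:
form = LoginForm()
fire(form, "on_text_changed", "alice")
fire(form, "on_click_submit")
print(form.status)  # -> ok
```

In a real .NET setting the equivalent lookup would go through System.Reflection (e.g., obtaining a MethodInfo and invoking it), with nUnit supplying the assertions.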
January 13, 2004
Topic: A Case Study in Best Practices in Software Process Documentation: Space Station Software Project Measurement and Analysis
The Software and Data Systems (S&DS) Team of the (International) Space Station Biological Research Project (SSBRP) has recently achieved a CMMI Maturity Level 2 rating. Key to this achievement was mastery of the Measurement and Analysis (MA) Process Area -- one that did not exist in the previous CMM. S&DS applied the Practical Software Measurement (PSM) approach to tackle the MA process area requirements. This presentation illustrates best practices in the area of Process Documentation through examples from S&DS's Measurement and Analysis process. Included are insights into Process and Plan template creation and use, tailoring, and compliance. And of course there's the measurement process itself -- which has garnered praise from the CMMI appraisers and from measurement experts at the SEI.
Speaker: Rob Robason
Rob Robason is a Senior Process Engineer with Intrinsyx Technologies at NASA Ames Research Center, supporting Ames' SEPG, and the Software and Data Systems (S&DS) team of the Space Station Biological Research Project. S&DS achieved a CMMI Maturity Level-2 rating last October, and Rob was singled out and recognized for his contributions to the team's accomplishment, including his definition and implementation of the S&DS Measurement and Analysis process, critical to the CMMI rating. Rob has also improved software processes at Cisco Systems, Accugraph, and Hewlett Packard. He also has experience as a Systems Engineer and Software Development Engineer at HP, and as the Quality Manager at Accugraph. Rob earned a BS in Electronic Engineering at Cal Poly, and has done graduate work in Computer Science at Colorado State and UC Santa Cruz and in Industrial Engineering at Texas A&M. Rob can be reached at rob AT robason.net.
December 9, 2003
Topic: Capability Maturity Model for Software (CMM)
This presentation will provide an overview of the SEI Capability Maturity Model (SEI/CMM). Topics will include discussion of the purpose of the SEI/CMM, the five levels, the importance of metrics, and how the SEI/CMM framework can be used to understand and improve the quality of software development.
Speaker: Jeff Jacobs
Jeff Jacobs has over 20 years experience in software development, with a focus on software development methodologies and practices. His management experience ranges from leading development for startups to overseeing multi-company efforts for communications satellite systems. He has consulted to numerous companies and has trained over 3000 students in Information Engineering and various modeling techniques. Jeff is the author of numerous papers and is a frequent presenter at technical conferences. He received his B.S. in Information and Computer Sciences from U.C. Irvine, where he was one of the co-authors of UCI LISP. Jeff is a consultant providing services in software process improvement, methodology adoption and tailoring, business/systems analysis and modeling. His web site is http://www.jeffreyjacobs.com.