
Silicon Valley Software Quality Association (SSQA)

Invited Guest Speakers




SPEAKER INDEX

DATE SPEAKER(S) ORGANIZATION TOPIC
November 13, 2012 Doug Hoffman Software Quality Methods, LLC. "Self-Verifying Data"
October 9, 2012 Jon Bach eBay "Exploratory on Purpose"
September 11, 2012 Kaarthick Subramanian CSC "Mobile Application Testing – Challenges & Best Practices"
August 14, 2012 Paul Linares Crosstest, Inc. "White Box Test Automation Framework for C/C++"
July 10, 2012 Elisabeth Hendrickson Quality Tree Software and Agilistry "Expeditions to the Unknown: Discovering Surprises and Risks in Software"
June 12, 2012 Tammy Davis, Ken Doran, moderators SSQA "Brainstorming!"
(Q&A regarding SSQA, future SSQA topics and speakers)
May 8, 2012 Duvan Luong Operational Excellence Networks "Operational Excellence for SQA"
April 10, 2012 Doug Hoffman Software Quality Methods, LLC. "Exploratory Test Automation"
March 2012 None No SSQA Meeting This Month
February 9, 2012 Brian Lawrence Coyote Valley Software "How to use lies to get the quality you need"
January 10, 2012 Sandeep Khabiya Hewlett-Packard Wry Toastmasters Club "Conquer your Speaking Fear"
Jon Regan Hewlett-Packard Wry Toastmasters Club "Powerful Speaking Without Preparation"
November 8, 2011 Karen Burley Hewlett-Packard "Can QA Innovate?"
Mary Ann May-Pumphrey Adobe EchoSign "Converting from Selenium/Perl to Selenium/Python with the Page Object Model: Four Very Useful Aids"
Doug Hoffman Software Quality Methods, LLC "Self-Verifying Data"
October 11, 2011 Mary Ann May-Pumphrey, moderator SSQA "SSQA in 2012!"
September 13, 2011 Don Miller Product Life Cycle Process Architect, PayPal Quality Before Code: The Beginning of the Product Life Cycle
Karen Burley Engineering Section Manager, HP A Case Study in Agile QA
August 2011 None No SSQA Meeting This Month
July 12, 2011 Jeff Richardson Chief Transformational Engineer, Empowered Alliances The Team Effectiveness "Learning Lab"
June 14, 2011
Hemant Gaidhani
Senior Technical Marketing Manager, VMware
Performance & Scalability Testing Virtual Environment
May 10, 2011
Jane Fraser
QA Director, Pogo
To Fix or Not to Fix - that is the Question

Adrienne Hunter
QA Manager, PACE Anti-Piracy
Is Sikuli a viable QA tool?

Susan McVey
Software Quality Engineer, IBM Silicon Valley Lab
Starting a Geographically Distributed Test Team

Tim Stein
CEO / President, Business Performance Associates, Inc.
Verification in the Development of Medical Device Software Per IEC 62304

Forest Weld
Director of QA & Support, Arxan Technologies
Test/Support Synergy

Malvika Agrawal
Software Quality Engineer, IBM
Book review of: "Implementing Automated Software Testing: How to Save Time and Lower Costs While Raising Quality"
April 12, 2011
Adam Christian
JavaScript Architect, Sauce Labs
Windmill - The Selenium Oppugner
March 8, 2011
Henry Cate
QA Engineer, Teradata Corp.
Monkey: Tool for Generating Random SQL Test Cases
February 8, 2011
Jon Bach
Director of Quality Engineering, eBay's Search and Discovery team
Applying Creative Thinking to Quality and Testing Problems
January 11, 2011
Nirmala Anisetti
Quality Engineering Manager, Yahoo!
Security Testing: Paranoid Approach
December 14, 2010
None
No SSQA meeting this month
Happy Holidays, See you in 2011!
November 9, 2010
None
No SSQA Meeting This Month

October 12, 2010
Doug Hoffman
Software Quality Methods LLC, ASQ Fellow
Why Tests Don't Pass
September 14, 2010
Sandeep Bhatia
Sr. Development Manager for Quality, Intuit
How Agile Takes Care of Quality
August 10, 2010
None
Meeting cancelled due to speaker illness

July 13, 2010
Nixon Augustin
Software Engineer, Brocade Communications Systems
Introduction to TestLink
June 8, 2010
Alex Pineda
Sr Software Development Manager, Oracle Corp.
The Myths and Pitfalls of QA
May 11, 2010
Sunita Vaswani
Quality Engineer, IBM Rational
Effective Application of Software Test Automation at IBM Rational
April 13, 2010
Ramesh Mandava
Engineering Manager, eBay
Pushing Quality Upstream at eBay
March 9, 2010
Jeff Richardson
Chief Transformational Officer, Empowered Alliances
The Secrets of Successful Networking - Expanding Your Professional Network
February 9, 2010
Susan McVey
Software Quality Engineer, IBM Rational
Software Testing for the Long Term
January 12, 2010
None
SSQA meeting cancelled due to scheduling conflict

December 8, 2009
None
No SSQA meeting this month
Happy Holidays, See you in 2010!
November 10, 2009
Nixon Augustin
Software Engineer, Brocade Communications
Tutorial on Using STAF (open source test automation framework)
October 13, 2009
Rutesh Shah
Founder/CEO, InfoStretch Corp.
Enterprise 2.0 and Testing Challenges
September 8, 2009
Tim Riley
Murali Nandigama
Director of QA, Mozilla Corp.
Consultant
Effective Gap-Centric Test Development Strategies Using Code Coverage and Test Case Mapping
August 11, 2009
Jagadesh Munta
Sr. Software Engineer, Sun Microsystems
Taking Quality to Developer Desktop: Java Static Analysis with FindBugs
July 14, 2009
Ken Doran
Administrative Systems, Stanford University
Building Your Software QA Library
June 9, 2009
Jonathan Lindo
Co-founder/CEO, Replay Solutions Inc.
TiVo(tm) for Software, the future is now!
May 12, 2009
T Ashok
Founder/CEO, STAG Software (Bangalore India)
Hypothesis-Based Testing
April 14, 2009
Nixon Augustin
Software Engineer, Brocade Communications
Multi-Client Testing Using STAF (open source, test automation framework)
March 10, 2009
Mary Ann May-Pumphrey
Software QA Engineer and DeAnza College Instructor
Automated Web Page Testing with Selenium IDE:  A Tutorial
February 10, 2009
Russell Pannone
Agile Product Development Practitioner and Coach
Quality Assurance in the World of Agile Systems / Software Development
January 13, 2009
None
SSQA meeting cancelled due to scheduling conflict

December 9, 2008
None
No SSQA meeting this month
Happy Holidays, See you in 2009!
November 18, 2008
Satya Dodda
Director of Software Quality Engineering, Sun Microsystems
Test-Driven Development Best Practices
October 14, 2008
John Green
Sr. Staff Engineer, VMware
Success with Test Automation
September 9, 2008
Yashwant Shitoot
Consultant
Beyond Testing - Achieving Software Excellence
August 12, 2008
Sriram Lakkaraju
QA Manager, Sun Microsystems
How To Ensure Enterprise Software Is Highly Available
July 8, 2008
Jim Singh
Director of Technology, VMLogix
Test Lab Virtualization
June 10, 2008
Sachin Bansal
Senior Quality Engineering Manager, Adobe Systems
Total Automation!
May 13, 2008
Samir Shah
Founder/CEO, Zephyr
Enterprise 2.0 is here - Upgrade your Test Department!
April 8, 2008
Tim Riley
Director of Quality Assurance, Mozilla Corporation
The SQA Approach on the Mozilla Project - How Firefox gets Tested
March 11, 2008
Madhava Avvari
QA Manager, Ad Serving Systems, Yahoo!
Testing a Cool Internet Technology called "Ad Serving Systems"
February 12, 2008
Peter Jensen
Software Architect, Sun Microsystems
Clichés, Metrics, and Methods:  A Discussion of the Quality System and its Role in Contemporary Software Development
January 8, 2008
Doug Hoffman
Consultant, Software Quality Methods LLC
Exploring an Expanded Model for Software Under Test
December 11, 2007
None
No SSQA meeting this month
Happy Holidays, See you in 2008!
November 13, 2007
Murali Nandigama
Senior Development Manager, Oracle Corporation
Security in the Software Development Life Cycle
October 16, 2007
Kowsik Guruswamy
Co-founder and CTO, Mu Security
Life is not static, so why are your test cases?
September 11, 2007
Doug Hoffman
Software QA Program Manager, Hewlett-Packard
A Graphical Display of Testing Status for Complex Configurations
August 14, 2007
Tim Riley
Director of Quality Assurance, Mozilla Corporation
Testing in the World of Open Source Software
July 10, 2007
Gopal Jorapur
Engineer, Sun Microsystems
Closing the Loop On Quality - Integrating Customer Feedback
June 12, 2007
Cédric Beust
Senior Software Engineer, Google
Next Generation Testing with TestNG
May 8, 2007
Larry Steinhaus
Sujoy Ghosh
Program Manager, NonStop Div., Hewlett-Packard
Program Manager, NonStop Div., Hewlett-Packard
Developing and Using a Defect Removal Model to Predict Customer Experiences on Software Products
April 10, 2007
Brian Lawrence
Principal, Coyote Valley Software
The Software Project as a Journey
March 13, 2007
Anita Wotiz
Program Coordinator, Software Engineering, UCSC Extension
Requirements Management, an Integral part of Quality Release
February 13, 2007
Sachin Bansal
Senior Quality Engineering Manager, Adobe Systems
How to Design Regression Test Automation Frameworks for System Testing
January 9, 2007
David Roland
Sr. Computer Scientist, Computer Sciences Corp., NASA Ames Research Center
Keeping Score - How to Know When You're Done
December 12, 2006
Tim Stein, Jason Reid, Alka Jarvis, James Cunningham
Local Authors
Book Signing and Holiday Festivities with Local Authors
November 14, 2006
Doug Hoffman
Program Manager, Hewlett-Packard
Test Automation Beyond Regression
October 10, 2006
Hans Buwalda
Chief Technology Officer, LogiGear Corp.
The 5 Percent Rules of Test Automation
September 12, 2006
Mark Himelstein
President, Heavenstone Inc.
Who is Responsible for Quality?
August 8, 2006
Duy Bao Vo
Graduate Student, San Jose State University
Quality-Driven Build Scripts for Java Applications
July 11, 2006
David Hsiao
Director, Metrics Strategy and Benchmarking COE, Cisco Systems
Metrics, Benchmarking and Predictive Modeling
June 13, 2006
Duvan Luong
Technical Director for Enterprise Quality, Cadence Design Systems
Fighting the BUG WAR with Inspections and Reviews: A Success Story
May 9, 2006
Lisa K. Arnold
Anu Ranganath
Customer Satisfaction Analyst, Cisco Systems
Voice-of-the-Customer Program Manager, Cisco Systems
Using Data-driven Analysis to Increase Customer Satisfaction
April 11, 2006
Mikhail Portnov
Founder, Portnov Computer School
Software Testing as a Career – Still Viable?
March 14, 2006
Brian Lawrence
Principal, Coyote Valley Software
Software Engineering: Facts or Fancy?
February 14, 2006
Claudia Dencker
Doug Hoffman
President, Software SETT Corporation
President and Principal Consultant, Software Quality Methods, LLC
QA Road Warriors
January 10, 2006
Lew Jamison
CEO / Learning Strategist, Performance Improvement Circle
The T in Quality
December 13, 2005
Lew Jamison
CEO / Learning Strategist, Performance Improvement Circle
Quality Training: What's been your experience?
November 8, 2005
Dr. George Bozoki
Founder, Target Software
Estimating Software Size
October 11, 2005
Jeff Jacobs
Covad Communications, Jeffrey Jacobs and Associates
Logical Entity/Relationship Modeling: The Definition of Truth for Data
September 13, 2005
Aditya Dada
Sun Microsystems
Automation Techniques for Enterprise Application Testing
August 9, 2005
Keith Mangold
QAnalysts
A Process Driven Approach for Effective Application Service Quality for IT Organizations
July 12, 2005
Doug Hoffman
SDT Corporation
Early Testing Without the "Test and Test Again" Syndrome
June 14, 2005
Jason Reid
Sun Microsystems
Sabotaging QA: a Primer
May 10, 2005
Robert Konigsberg
Network Evaluation
The State of Spyware
April 12, 2005
Andy Tucker and Keith Wesolowski
Sun Microsystems
Solaris and Open Source - Current Status
March 8, 2005
Dave Segleau
Sleepycat Software
QA and Open Source - The Good, the Bad and the Ugly
February 8, 2005
Roundtable Discussion
SSQA Membership
War Stories from the Ground Level
January 11, 2005
Yana Mezher
Dave Weir
Dave Liebreich
Software SETT Corp.
Calavista
Yahoo
Tips for Managing an Offshore Team from Three Who Know
December 14, 2004
Sean Nihalani
UC Santa Cruz Extension
Outsourcing in Software Engineering
November 09, 2004
Tim Stein
Business Performance Associates
Part 11 - Electronic Records and Electronic Signatures: Review of the Regulation and a Discussion of Issues
October 12, 2004
Damien Farnham
Senior Manager, Solaris Performance, Sun Microsystems
Why Performance QA is Broken and How to Fix it
September 14, 2004
William Estrada
Mt Umunhum Wireless
SNMP: A primer
August 10, 2004
Peter Yarbrough
Software SETT Corp.
Staying Relevant in a Competitive Market
July 13, 2004
Merrin Donley
Silicon Valley Workforce Investment Board
Market Based Job Searching
June 8, 2004
Gail Lowell
InPhonic
Test Variables Impacting Wireless Applications
May 11, 2004
Rhonda Farrell and Jason Reid
Self and Sun Microsystems
Security Testing
April 13, 2004
Ross Collard
Collard & Company
Realistically Estimating Test Projects
March 9, 2004
Panel Discussion
SSQA Membership
War Stories at the Ground Level
February 10, 2004
Elisabeth Hendrickson
Quality Tree Software
Roll Your Own .NET Automated Tests
January 13, 2004
Rob Robason
Intrinsyx Technologies
A Case Study in Best Practices in Software Process Documentation: Space Station Software Project Measurement and Analysis
December 9, 2003
Jeff Jacobs
Jeffrey Jacobs & Associates
Capability Maturity Model for Software (CMM)


SPEAKER DETAILS

November 13, 2012

Self-Verifying Data

Some tests require large data sets. The data may be database records, financial information, communications data packets, or a host of others. The data may be used directly as input for a test, or it may be pre-populated as background records. Self-verifying data (SVD) is a powerful approach to generating large volumes of information in a way that can be checked for integrity. The presentation describes three methods for generating SVD, two of which can be used easily.
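
As a minimal sketch of the embedded-checksum flavor of SVD (an illustration of the general idea only, not code from the talk; the record layout and field names are hypothetical):

    # Self-verifying data, embedded-checksum style: each record carries a
    # digest of its own contents, so integrity can be checked after a test
    # run without consulting an external oracle.
    import hashlib
    import random
    import string

    def make_record(record_id):
        # Generate a random payload and embed a digest of id + payload.
        payload = "".join(random.choices(string.ascii_letters, k=32))
        digest = hashlib.sha256(("%d:%s" % (record_id, payload)).encode()).hexdigest()
        return {"id": record_id, "payload": payload, "digest": digest}

    def verify_record(record):
        # Recompute the digest; any mismatch means the record was corrupted.
        expected = hashlib.sha256(
            ("%d:%s" % (record["id"], record["payload"])).encode()).hexdigest()
        return record["digest"] == expected

    records = [make_record(i) for i in range(100000)]
    # ... run the system under test against the records, then:
    assert all(verify_record(r) for r in records)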

Speaker: Doug Hoffman, (President, Software Quality Methods, LLC.)

Doug has more than thirty years of experience with software engineering and quality assurance. Today he teaches and does management consulting in strategic and tactical planning for software quality. Training specialties include context-driven software testing, test oracles, and test automation design. His technical specialties include test oracles, test planning, automation planning, and developing test architectures. Management specialties include ROI based planning of software engineering mechanisms, QA management, organizational assessment, evaluating and planning for organizational change, managing and creating project management offices, building new quality organizations, and transforming existing quality assurance and testing groups to fulfill corporate visions.

He has been the Chair of the Silicon Valley section of the American Society for Quality (ASQ) and is the current Chair of the Silicon Valley Software Quality Association (SSQA).  He's a founding member and past Member of the Board of Directors of the Association for Software Testing (AST), as well as a member of ACM and IEEE, and a Fellow of ASQ.

----------------------------------------

October 9, 2012

Exploratory on Purpose

Contrary to what you might think, exploratory testing has structure, much in the same way jazz has structure or a conversation with a friend has structure.  It may not seem like it, but once you know what to look for, you won’t be able to ignore it.  This presentation is about simple methods, tactics, and tools you can use to leverage the structures in exploration to quickly find important bugs, while responding to any scrutiny about what you did to find them.

“If we are alert, with minds and eyes open, we will see meaning in the commonplace; we will see very real purposes in situations which we might otherwise shrug off and call ‘chance’.” -- from a lecture by Jon’s grandfather, Roland Bach.

Speaker: Jon Bach (Director, Live Site Quality, eBay)

Jon's role at eBay is to coordinate efforts to find production bugs on eBay's related sites and take measures to improve customer experiences through bug advocacy. A veteran of more than a dozen STAR conferences, he delights in telling stories from 17 years in testing for companies like LexisNexis, Hewlett-Packard, and Microsoft.  He is an award-winning keynote speaker and blogger, but his claim-to-fame is as co-creator of Session-Based Test Management, a way to manage and measure efforts from exploratory testing.

----------------------------------------

September 11, 2012

Mobile Application Testing – Challenges & Best Practices

As mobile adoption continues to grow, from enterprise applications to consumer applications, companies recognize the potential to boost revenue, decrease costs, and reach out efficiently to their customers. However, the mobile market is becoming increasingly competitive and complex. The huge diversity of devices, operating systems, OS versions, carriers, etc. makes it virtually impossible to sustain reasonable quality standards for mobile applications and websites across platforms. The complexity starts with a multitude of mobile devices having different screen sizes, hardware configurations, and image rendering capabilities. In addition, the proliferation of operating platforms and the hundreds of mobile phone carriers worldwide working on diverse local network standards (GSM, CDMA, etc.) puts further strain on development teams. Hence, testing becomes more complicated and mobile QA is becoming increasingly expensive. This presentation addresses mobile QA challenges, such as minimizing the cost of mobile QA without compromising application quality, mobile QA test infrastructure, and best practices in the area.

Speaker: Kaarthick Subramanian (Regional Practice Leader, Global Applications, CSC)

Kaarthick Subramanian currently serves in the Independent Testing Services organization at CSC, where he leads the Banking, Financial Services and Insurance (BFSI) testing practice. Earlier, at Polaris Software Lab Ltd (financial technology company), Kaarthick served as VP and Head of Testing Engagements (Strategic Accounts). Kaarthick's engagement with Fortune 500 companies has involved deploying strategies, techniques and tools around test management, functional testing, and the automation of enterprise applications. Kaarthick has helped organizations like Polaris and Lionbridge (and their clientele) build efficient practices in the QA and Test Management space. You may reach Kaarthick at ksubramani30@csc.com.

----------------------------------------

August 14, 2012

White Box Test Automation Framework for C/C++

White box testing can be time consuming and can even reduce product quality if done in the dark. Combining test design, setup, execution, and analysis in an integrated solution can reduce test time and increase quality by facilitating and visualizing code and tests throughout development and debugging. The talk will present a white box test framework that tackles the problem of insufficient early-stage testing. A demonstration of the framework will be part of the presentation.

Speaker: Paul Linares (VP, Customer Solutions, Crosstest, Inc.)

Paul Linares serves as VP Customer Solutions at CrossTest. Earlier, at Atempo, a data protection and backup software solution provider, Paul was in charge of customer satisfaction as world-wide Vice President of Customer Support. Previously, Paul was Vice President of Operations at NetCentrex, a voice-over-IP leader, and held Vice President of Engineering positions in several startups (on-line training and on-line banking). Prior to that, he helped the rapid expansion of CETIA, a subsidiary of Thales, a 12-billion-euro electronics, defense, and aerospace corporation. Paul earned a BSEE from Ecole Nationale de l'Aéronautique et de l'Espace and an MS from the California Institute of Technology, where he specialized in Computer Science and Electrical Engineering.

----------------------------------------

This meeting marked the 25th anniversary of the first meeting of what would become SSQA-SV. Doug Hoffman provided a copy of the minutes from that first meeting of the "Software Quality Assurance 'Creative Force'".


July 10, 2012

Expeditions to the Unknown: Discovering Surprises and Risks in Software

Exploratory testing involves learning about the software while simultaneously designing and executing tests, using feedback from the last test to inform the next. It's an approach that reveals risks and vulnerabilities no one thought about or could even have predicted in advance. When combined with automated unit and system level tests, it leads to high quality software with significantly fewer post-deploy surprises. In this session, you'll discover how you can systematically explore to discover surprises at all levels, from GUIs to code.

Speaker: Elisabeth Hendrickson (Founder, Quality Tree Software and Agilistry)

Elisabeth Hendrickson wrote her first line of code in 1980 on a TRS Model I and has been hooked on software development ever since. She's the founder and president of Quality Tree Software, Inc., a consulting and training company dedicated to helping software teams deliver working solutions consistently and sustainably. She also founded Agilistry Studio, a practice space for Agile software development in Pleasanton, CA. She served on the board of directors of the Agile Alliance from 2006 - 2007 and is one of the co-organizers of the Agile Alliance Functional Testing Tools program. Elisabeth splits her time between teaching, speaking, writing, and working on Agile teams with test-infected programmers who value her obsession with testing. You can find her on Twitter as @testobsessed and at http://www.qualitytree.com.


June 12, 2012 - SSQA Brainstorming Meeting

This meeting was a discussion moderated by Tammy Davis and Ken Doran: a brainstorm of future meeting topics and a discussion of various ideas regarding the group, its purpose, and its future. Notes were captured during the meeting.


May 8, 2012

Operational Excellence for SQA

Speaker:  Duvan Luong (Founder, Operational Excellence Networks)

Duvan Luong received his PhD in Information and Computing Science from Lehigh University in Pennsylvania. Duvan has 30 years of experience practicing operational excellence at IBM, Synopsys, Sun, HP, and Cadence, both as an individual technical contributor and in management. Drawing on his lifelong experience in organizational and business operational improvement, Duvan has authored the operational excellence methodology and framework, providing implementation guidelines and practices for use by those wishing to implement operational excellence in their companies. Further, Duvan founded the Operational Excellence Networks to assist companies in their implementation efforts. (Contact Duvan at: operationalexcellencenetworks.com@gmail.com.)


April 10, 2012

Exploratory Test Automation

Speaker:  Doug Hoffman (President, Software Quality Methods, LLC.)

Doug Hoffman is a management consultant in testing/QA strategy and tactics.


February 9, 2012

How to use lies to get the quality you need

People and organizations sometimes use lies to achieve the level of quality that they deliver. Beyond the "standard" lying techniques, one can categorize these lies as lies in paradigms, lies of logic, and lies with numbers. We'll examine these, and perhaps learn to use some of them to get the quality we need. This subject applies beyond software quality.

Speaker:  Brian Lawrence (Principal, Coyote Valley Software)

Brian Lawrence is a software consultant with a long history in the computing industry. His primary focus is teaching and facilitating requirements activities, as well as inspection, project planning, risk management, life cycles, and design specification techniques. 
Brian has served as the editor of what is now Better Software Magazine, sat on the editorial board of IEEE Software for 5 years, and has chaired conferences. He taught software engineering at the University of California Santa Cruz Extension for over a decade. Contact: brian@coyotevalley.com.


January 10, 2012 - "Conquer Your Fear of Public Speaking"

This month's speakers from HP's Wry Toastmasters Club gave two presentations (see below) and involved the attendees in practice exercises geared towards improving one's public speaking and overcoming the mental or emotional blocks that get in the way.

 For more information on Toastmasters, visit these sites:

    http://Wry.freetoasthost.us/
    http://www.toastmasters.org/

  1. Conquer your Speaking Fear
    Speaker:  Sandeep Khabiya (HP, HP Wry Toastmasters Club)
  2. Powerful Speaking Without Preparation
    Speaker:  Jon Regan (HP, HP Wry Toastmasters Club)

November 8, 2011 - "Lightning Talks!"

The meeting consisted of 3 Lightning Talks where each speaker had 5 minutes for a mini-presentation (talk or review), and then another 5 minutes for Q&A.

  1. Can QA Innovate?
    Speaker:  Karen Burley (Engineering Section Manager, HP)

    This is the Lightning Talk Karen gave at the recent PNSQC.

    Bio: Karen Burley is an Engineering Section Manager at HP Software, managing several QA teams working on Enterprise Archiving products. Karen has over 25 years of software development and management experience ranging from real-time embedded microprocessor-based products to mission-critical data center system software to Software-as-a-Service, with a strong focus on process improvement and test automation. Karen has helped develop numerous products with multi-national teams in the US, India, Slovenia, Japan, and China, with teams that ranged from 2 to over 200. Karen has a BS degree in Computer Science from the University of Illinois with graduate work at Northwestern University. Creating high-quality products that contribute significantly to the company’s revenue and growth, that provide a great customer experience and yield high customer satisfaction, is Karen's passion.

  2. Converting from Selenium/Perl to Selenium/Python with the Page Object Model: Four Very Useful Aids
    Speaker:  Mary Ann May-Pumphrey (Automation QA Engineer, Adobe EchoSign)


    This talk covered two PushToTest-sponsored webinars, a Python textbook (The Quick Python Book, second edition), the Python Module of the Week blog, and a new open-source Selenium framework (PYSAUNTER), all used to improve the quality of the presenter's Selenium automation tests and, more generally, her Selenium automation skills. (A Page Object Model sketch appears after this list.)
  3. Self-Verifying Data
    Speaker: Doug Hoffman ( Software Quality Methods, LLC.)

    Whether we are doing manual or automated testing, we judge how well the software under test behaves using some kind of oracle.  Sometimes it is not obvious what the outcome should be. For example, we can generate large volumes of test data using programs and scripts. One problem with such generated data is confirming that it remains intact after we run the test exercise. One approach to the oracle problem is to embed the answer in the test data itself - called self-verifying data. The talk presents some ways to view, generate, and use self-verifying data oracles.
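
To illustrate the Page Object Model approach from talk 2 above, here is a brief sketch in Python with Selenium WebDriver (the page class, URL, and locator names are hypothetical, and the calls shown use the current Selenium API rather than the 2011-era one):

    # Page Object Model: each page gets a class; tests call its methods and
    # never touch locators directly, so UI changes are absorbed in one place.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    class LoginPage:
        URL = "https://example.com/login"   # hypothetical URL

        def __init__(self, driver):
            self.driver = driver

        def open(self):
            self.driver.get(self.URL)
            return self

        def log_in(self, user, password):
            self.driver.find_element(By.ID, "username").send_keys(user)
            self.driver.find_element(By.ID, "password").send_keys(password)
            self.driver.find_element(By.ID, "submit").click()

    driver = webdriver.Firefox()
    LoginPage(driver).open().log_in("qa-user", "secret")
    driver.quit()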

October 11, 2011 - "SSQA in 2012!"

This meeting was a discussion led/moderated by Mary Ann May-Pumphrey, the SSQA Programs Director. Notes were captured during the meeting.

September 13, 2011 - "2-Up"

  1. Quality Before Code: The Beginning of the Product Life Cycle
    Speaker:  Don Miller (Product Life Cycle Process Architect, PayPal)

    The software product life cycle begins the moment an idea is considered.  The path to quality is charted long before the design, code, test, and release activities that we traditionally focus on.  In this talk, key decisions in the first steps of the life cycle were considered to understand how quality can be determined before the first line of code is written.

    Bio: Don Miller designs, deploys, and manages processes and tools for systematic innovation.  As the Product Life Cycle Process Architect for PayPal, he is responsible for the development and continuous improvement of the common framework that PayPal uses to deliver simple and secure online and mobile payment solutions.  Don has 20 years of experience leading development, QA, process, and infrastructure efforts at Silicon Valley high-tech companies including Sun Microsystems, eBay, and Intuit.  Recently, his work at PayPal has exposed him to the front end of the life cycle.  Don earned a BS in Computer Engineering from the University of Illinois and an MBA from San Diego State University.

  2. A Case Study in Agile QA
    Speaker:  Karen Burley (Engineering Section Manager, HP)

    Having observed first-hand the many issues that can arise from following the Waterfall Model for software development, Karen set out in 2004 to find a better way. The better way was Agile, including Agile QA. Agile, unfortunately, was not well thought out in terms of QA and testing and hence the ongoing search for ways to incorporate QA in Scrum sprints to maximize QA and Development effectiveness. This talk covered lessons learned and tips for success, as well as touched on some tools that could be helpful in managing the Agile QA process.

    Bio: See Karen Burley's bio under the November 8, 2011 meeting above.


July 12, 2011

The Team Effectiveness "Learning Lab" (Transforming Personal & Team Effectiveness)

This will be an interactive exploration of what drives teams crazy (and what to do about it). Every team has the capability to achieve extraordinary performance results. So why do so few ever realize this potential? We know why. There will be an introduction to some of the fundamental challenges associated with how teams make decisions and how they can be overcome. We'll examine your brain through the lens of neuroscience research to understand its strengths and limitations, then relate it back to specific challenges pertaining to creativity, roles, change, team processes and trust.

Speaker:  Jeff Richardson (Chief Transformational Engineer, Empowered Alliances)

Jeff Richardson has 13 years of experience developing project leaders at Bay Area universities, Fortune 50 companies, and high-tech startups. Jeff's engineering and OD background, combined with an expertise in experiential team-building activities, makes his programs highly engaging and technically relevant. As an educator, Jeff was one of the lead designers of Stanford's Advanced Project Management Program, in addition to designing/teaching project leadership programs at San Jose State and UC Santa Cruz Extension. Jeff wrote the book on project team startup at a Fortune 50 company, in addition to consulting with hundreds of technical teams and teaching leadership skills for project managers at organizations like Cisco, Intuit, Texas Instruments, the City of San Jose, and Santa Clara County, to name a few. Mr. Richardson has a BS in Mechanical Engineering and an MS in OD & Change Management.


June 14, 2011

Performance & Scalability Testing Virtual Environment

Virtualization is becoming the new paradigm for application deployment in data centers. This talk discusses how virtualization impacts performance and, in turn, affects performance & scalability testing. The talk will list common pitfalls and best-practice recommendations for performance and scalability testing in a virtual environment.

Speaker:  Hemant Gaidhani (Senior Technical Marketing Manager, VMware)

Hemant Gaidhani is a Senior Technical Marketing Manager at VMware, Inc., responsible for evangelizing VMware's virtualization management products. Hemant is co-author of the book "Virtualizing Microsoft Tier 1 Applications with VMware vSphere 4" and has been instrumental in developing best practices for several multi-tier enterprise applications in VMware environments. Hemant is a regular speaker at VMworld, EMC World, Interop, and other industry trade conferences.




May 10, 2011 - "Software QA Lightning Talks"

  1. To Fix or Not to Fix - that is the Question
    Speaker:  Jane Fraser (QA Director, Pogo)

  2. Is Sikuli a viable QA tool?
    Speaker:  Adrienne Hunter (QA Manager, PACE Anti-Piracy)

  3. Starting a Geographically Distributed Test Team
    Speaker:  Susan McVey (Software Quality Engineer, IBM Silicon Valley Lab)

  4. Verification in the Development of Medical Device Software Per IEC 62304
    Speaker:  Tim Stein (CEO / President, Business Performance Associates, Inc.)

  5. Test/Support Synergy
    Speaker:  Forest Weld (Director of QA & Support, Arxan Technologies)

  6. Book review of: "Implementing Automated Software Testing: How to Save Time and Lower Costs While Raising Quality"
    Speaker:  Malvika Agrawal (Software Quality Engineer, IBM)



April 12, 2011

Windmill - The Selenium Oppugner

Get a different perspective on functional web testing!  Windmill is a web testing tool designed to let you painlessly automate and debug your web applications.  Windmill took a different route than Selenium, and you might find it refreshing.  Windmill seeks to make test writing easier, portable, and sustainable.  Gain some insight into the evolution and future of automated testing tools!

Speaker:  Adam Christian (JavaScript Architect, Sauce Labs)

Adam Christian is co-creator of Windmill and various other open source projects, including Mozmill (test automation framework for Gecko-based apps like Firefox and Thunderbird) and Jellyfish (JavaScript execution framework).  His blog is at www.adamchristian.com. Adam is currently employed as a JavaScript Architect at Sauce Labs.




June 9, 2009

TiVo(tm) for Software, the future is now!

The concept of 'TiVo for Software' has been described by some as a potential Holy Grail for application teams.  Software record/replay systems have existed in various forms for several years.  All of these are designed to give deeper insight into the inner workings of your applications, while at the same time allowing teams to spend less time setting up and reproducing the original conditions your software was running in.  In 2009, this technology has finally reached the point where it can be broadly deployed across the entire software lifecycle: development, QA, staging, and production with live customers.

In this talk, we will walk through the evolution of record/replay systems, look at what's now currently available, and examine the types of problems that are being solved today with this technology.  We will also explore how record/replay technology is dramatically changing the way that software problems are solved, changes are deployed, and data centers are managed.
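
As a toy sketch of the record/replay idea (a conceptual illustration only, not how Replay Solutions' product works internally):

    # Record mode captures each call's result; replay mode serves the captured
    # results back in order, so a failing run can be reproduced without the
    # original environment.
    import functools

    RECORDINGS = {}           # function name -> list of captured results
    MODE = "record"           # flip to "replay" to reproduce a captured run

    def record_replay(fn):
        log = RECORDINGS.setdefault(fn.__name__, [])

        @functools.wraps(fn)
        def wrapper(*args):
            if MODE == "record":
                result = fn(*args)   # exercise the real dependency
                log.append(result)   # capture its output for later replay
                return result
            return log.pop(0)        # replay: serve captured outputs in order
        return wrapper

    @record_replay
    def query_inventory(sku):
        return hash(sku) % 100       # stand-in for a call to a live service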

Speaker:  Jonathan Lindo (Co-founder/CEO, Replay Solutions Inc.)

Jonathan Lindo is co-founder/CEO of Replay Solutions, a pioneer in source code-level record/replay technologies.  Jonathan is passionate about improving the way software is built and delivered.  His research has resulted in multiple patent filings, including a patent issued for the recording/replaying of computer programs.  Jonathan frequently speaks on software lifecycle best practices and enjoys engaging audiences to uncover new ideas for using technology to improve the lives of software teams.  He studied Electrical Engineering at Carleton University before being pulled into the software industry.  He has over 15 years of experience as an engineer and entrepreneur.


May 12, 2009

Hypothesis-Based Testing

The field of software testing is littered with jargon, process models, and tools.  Although there's been significant progress in this field in recent years, we find it difficult to provide a logical means to get closer to perfection.  A typical approach to testing based on the activity-based model consists of strategizing, planning, designing, automating, executing, and managing.  Over the years, we've moved from completing these activities in one pass to an agile version in which they're done in short increments.  Yet the notion of "guarantee" seems elusive.

In this talk, the intent is to examine a different approach that can guarantee the quality of software.  "Guarantee" here implies that the deployed software will not cause business loss.  It's generally understood that testing is a process of uncovering defects that's accomplished via a good mix of techniques, tools and people skills.  To make guarantees, it's imperative that the approach to evaluation be sharply goal-focused.  Goal-focused evaluation means that we should have clarity as to what potential defects we need to go after.  Once the potential defects are discerned by employing a scientific approach, it's possible to arrive at an effective validation strategy, a complete set of test cases, better measures of cleanliness (quality), and appropriate tooling.

Hypothesis-based testing is built on the core theme of hypothesizing potential defects and then scientifically constructing a test strategy, test cases, measures, and tooling.  Hypothesis-based testing is powered by STEM 2.0 (STAG Test Engineering Method), STAG's defect detection technology, and has been adopted by various customers over the last 8 years.  The business benefits derived by applying STEM are a reduction in development/test effort, lower software support costs, and accelerated development.

Speaker:  T Ashok (Founder/CEO, STAG Software, Bangalore India)

T Ashok is Founder/CEO of STAG Software Private Ltd (Bangalore, India), a company specializing in inventing test technologies and providing boutique test solutions to deliver clean software.  Passionate about excellence, Ashok's mission is discovering methods to build clean software.  He's enthusiastic about sharing his knowledge and thus conducts workshops and speaks at key forums.  At STAG, he's immersed in the science and engineering of testing to develop methods and tools for engineering effective tests.  He's an alumnus of the Illinois Institute of Technology (Chicago) and Anna University (India) and has over 23 years of industry experience.


April 14, 2009

Multi-Client Testing Using STAF

The Software Testing Automation Framework (STAF) is an open source, multi-platform, multi-language framework designed around the idea of reusable components or services (such as process invocation, resource management, logging, and monitoring).  STAF removes the tedium of building an automation infrastructure, thus enabling one to instead focus on building the automation solution.  This presentation will discuss test automation challenges faced at Brocade's Files business unit and how using STAF helped Files to achieve its automation goals.
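
A minimal sketch of driving STAF from a Python harness, assuming STAF's bundled PySTAF wrapper and its STAFHandle/submit interface (consult the STAF documentation for the exact API on your installation; the host names and test script here are hypothetical):

    # Register with the local STAF daemon, ping each remote test client, and
    # launch a test script on it via STAF's PROCESS service.
    from PySTAF import STAFHandle

    handle = STAFHandle("ssqa-demo")
    for client in ("client1", "client2"):            # hypothetical host names
        result = handle.submit(client, "PING", "PING")
        if result.rc != 0:
            raise RuntimeError("%s unreachable, rc=%d" % (client, result.rc))
        handle.submit(client, "PROCESS",
                      "START COMMAND ./run_tests.sh WAIT")   # hypothetical script
    handle.unregister()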

Speaker:  Nixon Augustin (Software Engineer, Brocade Communications)

Nixon Augustin is a software engineer with Brocade Communications Systems.  He began at Brocade as a Hardware / Firmware Development Engineer and over time moved into test automation.  Prior to Brocade, he worked at Cisco, Maranti Networks, and Hewlett-Packard.  Nixon's interest is in providing simple, flexible, innovative, and low-cost automation solutions to complex test scenarios, and in applying best practices learned from the past to make life easier for test suite end-users and developers.


March 10, 2009

Automated Web Page Testing with Selenium IDE:  A Tutorial   (presentation slides -- PDF)

Increasing global competition and a poor economy worldwide are forcing companies to look even harder at open-source solutions.  Selenium is an increasingly popular choice for the automation of web page testing.  This tutorial is a 75-minute condensation of Mary Ann's 30-hour class on Selenium IDE, which she teaches at De Anza College, Santa Clara Adult Education's High Tech Academy, and the Portnov QA School.  Don't expect a "Selenium Big Picture" talk or a "Selenium Rah-Rah" talk!  There are a number of Selenium projects currently.  However, Mary Ann firmly believes "one must walk before one can run", which in Selenium terms means that one must learn the IDE well before one can move on to RC (Remote Control).

Speaker:  Mary Ann May-Pumphrey (Software QA Engineer and DeAnza College Instructor)

Mary Ann May-Pumphrey has many years of experience as a software QA engineer at Silicon Valley high-tech companies including Sun Microsystems, Yahoo!, and a search engine start-up.  She's also worked for shorter stints as a tech support engineer, tech writer, accessibility program manager, tech editor, and developer of online supplements for textbooks.  For many years, Mary Ann has also served as a part-time faculty member in De Anza College's CIS Department, where she has taught courses in QA, Perl, UNIX/Linux, Shell Programming, HTML, JavaScript, and Intro to Computers.  She initially developed her Selenium IDE course at De Anza in order to provide her students with automation training without breaking the CIS Department's budget!


February 10, 2009

Quality Assurance in the World of Agile Software Development   (presentation slides -- PDF)

Available evidence shows that being agile has its rewards, but there are challenges your organization and teams will face when adopting agile product development.  This presentation will bust the myth that there's no place for Quality Assurance in being agile.  You'll learn how quality improvement is built into being agile as a result of iterative and incremental product development, daily stand-ups, sprint reviews, and retrospectives.  This presentation will lay the foundation and building blocks that will enable you to gain a common understanding of what it means to be agile and to apply creative agile thinking to your systems / software development projects.

Speaker:  Russell Pannone (Agile Product Development Practitioner and Coach)

Russell Pannone is an agile product development practitioner and coach.  He's worked in the systems / software development industry for over 25 years in a variety of roles including: agile coach, agile practitioner, scrum master, developer, team leader, data modeler, and project manager.  Russell has led agile projects and worked with clients in a number of industries, including state and local government, aerospace, banking, insurance, energy, and telecommunications.  He was a key contributor in the early development of an Agile Adoption Practice and Agile Acceleration Roadmap and the IBM Rational Method Composer (RMC).  His experience includes significant work in relational database system design and implementation, and consulting and mentoring a wide variety of clients on iterative and incremental system / software development improvement strategies and approaches.  Russell is a member of the Scrum Alliance and the Agile Alliance.  He's also a Certified Scrum Master (CSM), Certified Scrum Product Owner (CSPO), and in the process of completing the paperwork to become a Certified Scrum Practitioner (CSP).


January 13, 2009

No SSQA meeting this month due to a room scheduling conflict.  See you on February 10 for "QA in the World of Agile Software Development".


December 9, 2008

No SSQA meeting this month.  Happy Holidays.  See you in 2009!


November 18, 2008

Test-Driven Development Best Practices   (presentation slides -- PDF)

Agile software development methods are gaining popularity.  The presentation will focus on one of the agile software development methods called Test-Driven Development (TDD).  The talk will provide an introduction to TDD and cover the latest terminology, including: test-code-refactor, test doubles, stubs, fakes, mocks, dependency injection, etc.  The talk will also cover some of the test frameworks employed in TDD, and discuss some TDD best practices and lessons learned.
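
As a small illustration of the test-double and dependency-injection vocabulary the talk covers (my example using Python's unittest, not material from the talk):

    # The collaborator (mailer) is injected, so a test can pass in a mock
    # instead of a real SMTP client -- the "test double" idea.
    import unittest
    from unittest.mock import Mock

    def send_invoice(order, mailer):
        if order["total"] <= 0:
            return False
        mailer.send(order["email"], "Invoice: $%.2f" % order["total"])
        return True

    class TestSendInvoice(unittest.TestCase):
        def test_mails_invoice_for_positive_total(self):
            mailer = Mock()                      # mock stands in for SMTP
            self.assertTrue(send_invoice({"email": "a@b.com", "total": 42.0}, mailer))
            mailer.send.assert_called_once_with("a@b.com", "Invoice: $42.00")

        def test_skips_zero_total(self):
            mailer = Mock()
            self.assertFalse(send_invoice({"email": "a@b.com", "total": 0}, mailer))
            mailer.send.assert_not_called()

    unittest.main()
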
Speaker:  Satya Dodda (Director of Software Quality Engineering, Sun Microsystems)

Satya Dodda is a Director of Software Quality Engineering at Sun Microsystems.  Currently, Satya leads a team of 75 engineers located in the US, France, the Czech Republic, and India.  Satya has over 16 years of experience in software development, testing, and management across GlassFish App Server, Sun Java Enterprise System, the Java SE Platform, Web Server, Message Queue, and the Solaris Operating System.


October 14, 2008

Success with Test Automation   (presentation slides -- PDF)

John Green will present lessons learned and best practices from many years of developing test automation projects.  Sample coding standards, processes, tools, and things to avoid will also be presented.
Speaker:  John Green (Sr. Staff Engineer, VMware)

John Green has 20 years of QA experience and 12 years of automation experience, mostly with SilkTest.  He was a consultant and trainer for Segue Software (developers of SilkTest), later purchased by Borland.  John has helped over 200 teams use SilkTest successfully, and currently is a Sr. Staff Engineer at VMware, focused on automation tools and processes.


September 9, 2008

Beyond Testing - Achieving Software Excellence

Although testing is an essential part of achieving superior software quality, it's not sufficient.  Additional ways must be found to improve software quality.  Even before the introduction of CMM, it was well recognized that the development process can have a significant impact on software quality.  This means that the testing team needs to understand the development process in use and should seek to influence the process.  This talk will cover several simple suggestions for the development process that have proven to have significant beneficial impact on software quality.
Speaker:  Yashwant Shitoot, CSQE, PMP

Yashwant Shitoot is a software engineer and project manager with over 25 years of industry experience.  He's spent most of his career on the development side.  Eight years ago, Yash began working on medical device software.  Since then, he's become increasingly involved with software quality issues.  Yash developed the class "Building Quality into Software Code" for UCSC Extension and will speak from his experience as a manager of mission-critical, real-time embedded software projects in the FDA arena.  Yash earned an MS in Physics at Auburn University.


August 12, 2008

Enterprise Software Testing:  How To Ensure Enterprise Software Is Highly Available

Today's enterprises must be available 24x7 to handle customer and partner requests.  This places hard requirements on these systems to be highly available with minimum down time.  These hard requirements mandate having clustered systems with hardware and software redundancy.  Ensuring high availability presents complex testing challenges.  Our speaker will provide an overview of highly available systems and the definition of 5-9's availability.  Also, our speaker will cover high availability testing methodologies, testing tools, and testing techniques, such as load balancing and failure injection.
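
For reference, the arithmetic behind "5-9's" (99.999%) availability works out to roughly five minutes of downtime per year:

    # Allowed downtime per year at each availability level ("number of nines").
    MINUTES_PER_YEAR = 365 * 24 * 60          # 525,600

    for nines in range(1, 6):
        availability = 1 - 10 ** -nines       # 0.9, 0.99, ..., 0.99999
        downtime = MINUTES_PER_YEAR * 10 ** -nines
        print("%d nines (%.5f): %8.1f minutes/year" % (nines, availability, downtime))
    # 5 nines (0.99999):      5.3 minutes/year
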
Speaker:  Sriram Lakkaraju (QA Manager, Sun Microsystems)

Sriram Lakkaraju has 12 years of experience in the computer software industry.  Currently, Sriram is a software engineering manager at Sun Microsystems, managing testing and harness tools development groups that test the SailFin (OpenSource Telco Application Server) and GlassFish Enterprise Server products.  Sriram has worked in several areas of Java technologies, including the Java SE platform, JAAS, JSSE, JSP, Servlets, JSTL, and EJBs.  He has extensive hands-on experience with Sun products such as Web Server 6.0 and GlassFish v1 and v2.  Sriram has presented at JavaOne and holds an MS degree in Computer Science from the University of South Carolina.


July 8, 2008

Test Lab Virtualization

As more enterprises and independent software vendors seek additional ways to leverage virtualization technology, Virtual Lab Automation (VLA) has emerged as an innovative solution for streamlining software development and automating the entire development and test environment setup, while utilizing existing server virtualization infrastructure.  In addition, VLA improves resource utilization and efficiency while pushing products to market faster.  This presentation will review the virtual test and development infrastructure and provide best practice recommendations for how VLA can add significant value to developers, testers, and IT operations staff and help drive business growth and employee productivity.
Speaker:  Jim Singh (Director of Technology, VMLogix)

Jim Singh joined VMLogix, Inc. in January 2008 as Director of Technology, and focuses on making the VMLogix lab management product easier to deploy and use.  Jim's 12-year career in software includes positions in development, QA, and professional services.  He was first introduced to automated test lab management during his time at Trilogy Software and has kept abreast of the space since 1999.  Jim received his BS in Computer Science from Cornell University and holds a software patent.


June 10, 2008

Total Automation!

Sachin Bansal of Adobe Systems will discuss automation strategies that can be followed (regardless of product or language) to achieve the TOTAL automation of quality engineering tasks in a fast-paced software life cycle.  Sachin will demonstrate different automation subsystems and how they work together to execute tests, collect data, archive data, and present data on demand for analysis.  The architecture of user-friendly regression, performance, and reliability test automation frameworks for system testing will be discussed.  Sachin will also outline some challenges and lessons learned from an ongoing journey in Total Automation.  With the help of distributed automation, his teams have been able to reduce testing time while increasing test coverage.  Sachin will present actionable suggestions for solving these critical problems, and provide a road map for successful global testing and test automation.
Speaker:  Sachin Bansal (Sr. Quality Engineering Manager, Adobe Systems)

Sachin Bansal is Senior Quality Engineering Manager at Adobe Systems.  He has extensive experience in designing and developing customized user-friendly server-side testing frameworks.  His research interest is mostly in client-server and database technologies.  Before joining Adobe, he held engineering and management positions at i2 Technologies and BlackPearl Inc.  Sachin graduated from the Indian Institute of Technology (IIT) Kanpur and has a Masters from the University of Wisconsin-Madison.  He's an enthusiastic speaker who spoke previously to SSQA in February 2007 and demonstrated an early prototype of a total automation system.


May 13, 2008

Enterprise 2.0 is here - Upgrade your Test Department!

Enterprise 2.0 has been defined as: flattening an organization, making it agile and flexible; harnessing the distributed and global aspect of its structure, making it simple and transparent; and utilizing on-demand and emerging information systems, shortening time-to-market cycles.  Does this describe your test department?  Samir Shah takes you through what all of this means to your test department and what you could be doing to upgrade it to Enterprise 2.0.  In this day and age of global outsourcing, new technologies and systems to test, newer test methodologies, SOAs and integrations, distributed computing and mashups... very little attention has been paid to bringing the test department into this new world and equipping it with the right toolsets, leaving frustrated managers with archaic, monolithic toolsets that are driven by projects and events.
Speaker:  Samir Shah (Founder/CEO, Zephyr)

Samir Shah is a QA Executive with over 16 years of direct test management experience, ranging from startups in Silicon Valley to the Global Fortune 50.  He's the Founder and CEO of Zephyr, an Enterprise 2.0 startup based in Sunnyvale, CA.  Most recently, he was Vice President of QA Practice at Patni Computers where, over a period of 6 years, he founded, built and managed a 300+ person Global QA Practice generating $75 million in testing revenue.  His broad experience includes various engineering roles at Wollongong, Attachmate, and PointCast, helping customers such as AT&T, Virgin Mobile, Disney Mobile, Du, Good Technologies, Vodafone, WhitePajama, Bulldog, Grand Central, RCC, Visage Mobile, KnowNow, and others.  Samir holds a Bachelors Degree in Electronics from Bangalore University and a Masters in Electrical Engineering from the University of Alabama.  Having been on the "other side" and perpetually unsatisfied about the quality, pricing and user experience of existing test management tools, Samir founded Zephyr to take on the challenge of bringing to market a refreshing new way of managing test teams.


April 8, 2008

The SQA Approach on the Mozilla Project - How Firefox gets Tested   (presentation slides -- PDF)

QA is a challenge in any organization, but open source development adds extra dimensions to that task.  Come hear Mozilla's Director of QA speak about testing in the creative world of open source software and how the Mozilla Project combines the effort of 22,017 test engineers and community volunteers to bring together specific SQA strategies, tools, and infrastructure.  Our speaker will cover Mozilla's approach to developing and executing both manual and automated tests, focusing on the testing of Firefox, Mozilla's award-winning web browser.
Speaker:  Tim Riley (Director of Quality Assurance, Mozilla Corporation)

Tim Riley is Director of Quality Assurance at Mozilla Corporation.  Tim leads a team of 20,000 nightly testers (!), 1600 identified QA volunteers (!), 400 developers writing unit test cases (!), and the 17 members of Mozilla's SQA team.  Before joining Mozilla, Tim managed teams testing high security operating systems and J2SE JRE/JDK at Sun Microsystems, as well as other test teams.


March 11, 2008

Testing a Cool Internet Technology called "Ad Serving Systems" (think Google and Yahoo!)

This presentation will address testing one of the latest internet technologies, "Ad Serving Systems".  Using a hypothetical Ad Server System as an example, the speaker will describe the goals, main functionality, key modules, and complexity of a contemporary Ad Serving System.  The speaker will describe the QA challenges faced and discuss solutions (test tools and test frameworks) employed to address those QA challenges.
Speaker:  Madhava Avvari (QA Manager, Ad Serving Systems, Yahoo!)

Madhava Avvari currently works as a QA Manager in Ad Serving Systems at Yahoo!  Before Yahoo!, Madhava managed or led various high performing QA teams testing the complex HotSpot Virtual Machine and other cool Java technologies at Sun Microsystems.  Prior to Sun, Madhava worked at Cisco-HCL and HCL Technologies.  Madhava's interests include quality engineering, distributed systems, and computer 3D-animation.  Madhava holds 5 patents on product quality and test automation technologies/systems.


February 12, 2008

Clichés, Metrics, and Methods:  A Discussion of the Quality System and its Role in Contemporary Software Development

With several well-known software development clichés as a starting point, this presentation looks at the primary forces shaping a software development project and how these forces are typically balanced in a commercial software development project.  We'll discuss the implications for quality goals, systems, and methods.  The talk will discuss the need for lightweight and flexible processes, combined with efficient, effective, and quantifiable defect containment.

The presentation looks at methods and techniques to enhance efficiency and proposes simple metrics to monitor in-process and overall defect containment effectiveness.  A section of the presentation (MICRO Methodology) looks at procedures and tools generally needed for any software development project, independent of development methodology, and with a potentially huge impact on team performance.

The presentation takes a broad view of software development in general, and quality assurance in particular.  Rather than attempting to provide complete and final answers to specific problems, the goal of the talk is to spark discussion, and maybe make us stop for a moment and think about how we do what we're so passionately doing.  If time allows, we'll consider some slightly more philosophical aspects of development methodology and take a quick look at how Isaac Newton, Albert Einstein, Niels Bohr, and Brian Greene (superstring theorist) might have gone about developing software.
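
One widely used containment metric of the kind the talk alludes to, defect removal efficiency (DRE), is simple enough to show directly (my illustration with hypothetical counts, not material from the slides):

    # DRE = defects removed before release / total defects (including escapes).
    found_before_release = 188   # hypothetical counts
    found_in_field = 12          # escapes discovered after release

    dre = found_before_release / float(found_before_release + found_in_field)
    print("DRE = %.1f%%" % (dre * 100))   # DRE = 94.0%
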
Speaker:  Peter Jensen (Software Architect, Sun Microsystems)

Peter Jensen has been with Sun Microsystems since 1997, and has held various engineering roles related to Telecommunications and Java technology.  As a hardcore developer and software architect, Peter's experience ranges from real-time operating systems, to compilers and middleware, to enterprise applications.  In recent years, he has spoken at the JavaOne conference on subjects of telematics and mobile-computing content delivery.  Three to four years ago, a stint doing performance testing for Sun's Content Delivery Server led to an increased interest in Quality Engineering.  Since then, Peter has worked to help define and improve development and testing practices and tools for Sun's internal and open source Java ME projects.  Before joining Sun, Peter worked for Chorus Systems in France and the US, where he was part of a small team developing a high-performance CORBA2 implementation for embedded and real-time applications.  Peter graduated from Aarhus University (Denmark) in 1991 with a Cand. Scient. degree in Computer Science and Mathematics.


January 8, 2008

Exploring an Expanded Model for Software Under Test

Many testers think about testing using the simplest model of the software under test (SUT): input/process/output.  More complex models facilitate the design of better tests, help us understand the nature of what we're testing, and help us interpret the observed results.  Doug has developed an expanded model, which represents the influences on SUT behavior and the domains of possible outcomes.  The talk will be a presentation and interactive discussion of the model and some of its implications.
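One way to see what an expanded model buys you, sketched with a toy example of our own: check not only the visible output of an operation but also the postcondition state it leaves behind (the real model covers further influences, such as environment and stored data):

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class ExpandedModelDemo {
        public static void main(String[] args) {
            Deque<String> stack = new ArrayDeque<>();  // stands in for the SUT
            stack.push("a");
            stack.push("b");

            String output = stack.pop();  // the visible output
            // Simple-model check: only the returned value.
            if (!"b".equals(output))
                throw new AssertionError("wrong output");
            // Expanded-model check: the state left behind matters too.
            if (stack.size() != 1 || !"a".equals(stack.peek()))
                throw new AssertionError("wrong postcondition state");
            System.out.println("output and postcondition state both verified");
        }
    }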
Speaker:  Doug Hoffman (Consultant, Software Quality Methods LLC)

Doug Hoffman has more than 30 years experience as a consultant, manager, and engineer in the computer and software industries.  He's currently an independent consultant with his company, Software Quality Methods.  Doug has been Chair and Program Chair for several local, national, and international quality conferences; he has also been a speaker at numerous conferences.  Doug is a Past Chair of the Silicon Valley section of the American Society for Quality (ASQ) and the Silicon Valley Software Quality Association (SSQA).  He's a founding member and past Member of the Board of Directors of the Association for Software Testing (AST), as well as a member of ACM and IEEE, and a Fellow of ASQ.


December 11, 2007

No SSQA meeting this month.  Happy Holidays.  See you in 2008!


November 13, 2007

Security in the Software Development Life Cycle

The best way to incorporate better security in any software development life cycle is to have a well-defined security process in place.  Yet, the reality in the current marketplace is that there is a heavy emphasis on security tools while the principle of incorporating security processes is ignored.  The bottom line should be "Process comes first, then tools."  Better testing processes like parameter validation (using the cardinal principle that "all input from users or external systems is evil until proved otherwise") and diligent code reviews (to catch logic bombs and poor coding practices that lead to vulnerabilities) will provide more bang for the buck than spending money on security audits of production-ready systems.  The fundamentals of putting better processes in place in the SDLC will be discussed in detail.  Using existing open source tools to perform security analyses on the code base will be demonstrated if time permits.
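A minimal sketch of the cardinal principle, using a hypothetical username field: whitelist validation accepts only a known-good pattern instead of trying to enumerate bad values:

    import java.util.regex.Pattern;

    public class InputValidation {
        // Accept only what is provably good; reject everything else.
        private static final Pattern USERNAME = Pattern.compile("[A-Za-z0-9_]{1,32}");

        static String requireValidUsername(String raw) {
            if (raw == null || !USERNAME.matcher(raw).matches())
                throw new IllegalArgumentException("rejected untrusted input");
            return raw;
        }

        public static void main(String[] args) {
            System.out.println(requireValidUsername("alice_01"));  // accepted
            try {
                requireValidUsername("alice'; DROP TABLE users; --");
            } catch (IllegalArgumentException e) {
                System.out.println("blocked: " + e.getMessage());
            }
        }
    }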
Speaker:  Murali Nandigama (Senior Development Manager, Oracle Corporation)

Murali Nandigama is a Senior Development Manager at Oracle and is Release Head for the Oracle Application Server 10g release.  Murali holds a PhD degree in Physics and has more than a decade of software industry experience in testing, development and management of QA, Performance, Security, and Release Management.  Murali is a Certified Software Quality Analyst, Senior Member of the IEEE, and a member of the IEEE Standards Association.  Murali has published research papers in multiple peer reviewed journals and has been a speaker at conferences on Quality, Security, and Process Automation techniques.  Murali has filed patents on QA and advanced process management methodologies and holds three patents in software testing tools and methodologies.


October 16, 2007

Life is not static, so why are your test cases?

Protocols are curious beasts.  They're the ultimate interface of any hardware/software product to the external world, be it file formats, APIs, communication protocols, RPC, or command line.  They're all intimately connected through re-use of constructs and patterns of vulnerabilities.  Testing protocols is an effective way to both unit test and system test a hardware/software deployment.  More importantly, by analyzing the attack surface exposed by these protocols, one can home in on the bugs that matter for expedited remediation.  In the connected world, there's not much difference between a bug and a vulnerability.  This talk will discuss these constructs, vulnerability patterns, and how a voltage regulator that's IP-enabled has much in common with web services.
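As a toy illustration of generating protocol tests from patterns rather than hand-writing each case, the sketch below flips single bits in a valid message; this example is ours, not Mu's technology, and real protocol testers work from structured models of the protocol:

    import java.util.Random;

    public class ByteMutator {
        public static void main(String[] args) {
            byte[] valid = "GET / HTTP/1.0\r\n\r\n".getBytes();
            Random rng = new Random(42);  // fixed seed so failures reproduce

            for (int i = 0; i < 5; i++) {
                byte[] mutated = valid.clone();
                int pos = rng.nextInt(mutated.length);
                mutated[pos] ^= (byte) (1 << rng.nextInt(8));  // flip one bit
                // A real harness would send this to the SUT and then check
                // that the SUT is still healthy.
                System.out.println(new String(mutated)
                        .replace("\r", "\\r").replace("\n", "\\n"));
            }
        }
    }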
Speaker:  Kowsik Guruswamy (Co-founder and CTO, Mu Security)

Kowsik Guruswamy is co-founder and CTO of Mu Security.  Prior to founding Mu, Kowsik was a Distinguished Engineer at Juniper Networks and the Chief Architect for the Intrusion Prevention product line.  Kowsik joined Juniper Networks through the acquisition of NetScreen/OneSecure, where he designed and architected the first inline Intrusion Prevention device.  He holds eight patents in various networking and security technologies and has an MS in Computer Science from the University of Louisiana.


September 11, 2007

A Graphical Display of Testing Status for Complex Configurations

Representing the status of software under test is complex and difficult, a task compounded when there are many interacting subsystems and combinations that must be tracked.  This paper describes a method developed for a one-page representation of the test space for a large and complex set of product components.  The latest project this was applied to had 10 interdependent variables and over 250 components.  Once the components are identified and grouped, the spreadsheet can be used to show configurations to be tested, record test outcomes, and represent the overall state of testing coverage and outcomes.  The paper uses a sanitized example modified from an actual test configuration.
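An invented miniature of the idea, far smaller than the 10-variable, 250-component projects the paper describes: collapse the configuration space into one at-a-glance grid of status per combination:

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ConfigStatusGrid {
        enum Status { UNTESTED, PASS, FAIL }

        public static void main(String[] args) {
            String[] oses = {"Linux", "Windows", "Solaris"};
            String[] dbs  = {"Oracle", "MySQL"};

            // One cell per configuration, all initially untested.
            Map<String, Status> grid = new LinkedHashMap<>();
            for (String os : oses)
                for (String db : dbs)
                    grid.put(os + "/" + db, Status.UNTESTED);

            grid.put("Linux/Oracle", Status.PASS);    // record outcomes as they arrive
            grid.put("Windows/MySQL", Status.FAIL);

            grid.forEach((config, status) ->
                    System.out.printf("%-16s %s%n", config, status));
        }
    }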
Speaker:  Doug Hoffman (Software QA Program Manager, Hewlett-Packard)

Doug Hoffman has more than thirty years experience as a consultant, manager, and engineer in the computer and software industries.  He is currently a Software QA Program Manager for Hewlett-Packard.  Doug is extremely active in quality communities.  He has been Chair and Program Chair for several local, national, and international quality conferences; he has also been a speaker at numerous conferences.  Doug is a Past Chair of the Santa Clara Section of the American Society for Quality (ASQ) and the Silicon Valley Software Quality Association (SSQA).  He is a founding member and past Member of the Board of Directors for the Association for Software Testing (AST), and a member of ACM and IEEE.


August 14, 2007

Testing in the World of Open Source Software

QA is always a challenge in any organization, but open source development adds extra dimensions to that task.  Come hear the Director of QA at Mozilla speak about testing in the creative world of open source software.
Speaker:  Tim Riley (Director of Quality Assurance, Mozilla Corporation)

Tim Riley is Director of Quality Assurance at Mozilla Corporation.  He leads a team of 10,000 part-time testers (!) and 16 full-time testers.  Tim has also managed teams testing high security operating systems, J2SE JRE/JDK, and a hosted web services network.


July 10, 2007

Closing the Loop On Quality - Integrating Customer Feedback

Quality teams are effective at testing against software requirements, but they often don't get relevant data to feed back into their test development process.  This presentation focuses on lessons learned integrating customer feedback into the quality process.
Speaker:  Gopal Jorapur (Staff Engineer, Sun Microsystems)

Gopal Jorapur is a staff engineer at Sun Microsystems.  He's always seeking creative ways to improve existing processes.  Gopal earned a BS degree in Electronics and Communications.  He did his degree project work in systems software and has 9 years of experience in software technologies and processes.


June 12, 2007

Next Generation Testing with TestNG

TestNG (http://testng.org) is an open source testing framework designed to cover all aspects of testing, from unit to functional and everything in-between.  TestNG has innovative features and is geared towards professional developers in search of a testing framework that covers all styles of Java code, from mobile to enterprise.  This popular test harness offers a number of enhancements relative to JUnit.  Cédric's talk will illustrate several TestNG features that enable advanced testing techniques, such as: Multi-thread testing; Data-driven testing; Using groups for better organization of tests; Dependent testing; and much more.
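For a taste of the features Cédric will cover, here is a small, self-contained example using standard TestNG annotations (the test scenario itself is invented); run it with the TestNG jar on the classpath:

    import org.testng.Assert;
    import org.testng.annotations.DataProvider;
    import org.testng.annotations.Test;

    public class TestNGFeatures {

        @DataProvider(name = "lengths")
        public Object[][] lengths() {                 // data-driven testing
            return new Object[][] {{"", 0}, {"abc", 3}, {"hello", 5}};
        }

        @Test(dataProvider = "lengths", groups = "fast")
        public void lengthMatches(String s, int expected) {
            Assert.assertEquals(s.length(), expected);
        }

        @Test(groups = "fast")
        public void connect() {
            // imagine establishing a session here
        }

        @Test(dependsOnMethods = "connect")           // dependent testing:
        public void query() {                         // skipped if connect() fails
            Assert.assertTrue(true);
        }

        @Test(invocationCount = 10, threadPoolSize = 5)  // multi-thread testing:
        public void concurrentAccess() {
            // invoked 10 times from a pool of 5 concurrent threads
        }
    }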
Speaker:  Cédric Beust (Senior Software Engineer, Google)

Cédric Beust is a co-founder of the TestNG framework and a senior software engineer at Google.  Cédric holds a PhD in computer science; his interests include aspect-oriented programming, testing, tools, back-end and GUI, and everything related to software engineering in general.  Cédric is a co-author of "Professional Java Server Programming J2EE 1.3" (Wrox Press) and maintains a well-read blog at www.beust.com/weblog.


May 8, 2007

Developing and Using a Defect Removal Model to Predict Customer Experiences on Software Products

Reliability is a key requirement for Hewlett-Packard's NonStop Server Systems.  The software that goes into these systems has to be of the highest quality.  A prediction of software quality can help better control development practices to achieve desired quality goals with available resources.  Several software defect models exist in the industry.  Growth models use statistical distributions to predict customer experience.  Other models predict customer experience based on the characteristics of past projects, products, and the development organization.  The speakers will share their experience of developing and using a phase-based containment model, where the effectiveness of defect removal activities is used to predict customer experience.  Larry and Sujoy will describe how they implemented the model in the organization to assess product quality in each phase of a project's life cycle, and how quality information is aggregated to make a release-level prediction of what customers will experience.  They will also share key benefits and lessons learned as a result of the defect removal initiative.
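A back-of-the-envelope sketch of how a phase-based containment model yields a customer-facing prediction; the phases, injection counts, and removal effectiveness values below are hypothetical, and HP's actual model is certainly more elaborate:

    public class DefectFlowModel {
        public static void main(String[] args) {
            String[] phases     = {"Requirements", "Design", "Code", "System Test"};
            double[] injected   = {20, 40, 100, 0};      // defects introduced per phase
            double[] removalEff = {0.5, 0.6, 0.7, 0.9};  // fraction caught per phase

            double carried = 0;  // defects flowing forward through the life cycle
            for (int i = 0; i < phases.length; i++) {
                carried += injected[i];
                double removed = carried * removalEff[i];
                carried -= removed;
                System.out.printf("%-12s removed %5.1f, escaping %5.1f%n",
                        phases[i], removed, carried);
            }
            System.out.printf("Predicted residual (customer-facing): %.1f%n", carried);
        }
    }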
Speakers:  Larry Steinhaus and Sujoy Ghosh (Program Managers, NonStop Division, Hewlett-Packard)

Larry Steinhaus is a program manager for Hewlett-Packard.  His primary focus is software QA improvement and software process improvement programs.  Larry has 17 years of software industry experience.  He's been a QA developer for the HP NonStop Operating System, a QA manager, and a Program Manager at HP.  Larry earned BS and MS degrees in computer science from CSU, Chico.

Sujoy Ghosh is a program manager for Hewlett-Packard.  His primary focus is software process improvement, software QA improvement, and customer issues.  Sujoy has 15 years of software and quality management experience.  He's been a software developer, Quality Manager, and Program Manager at HP and other companies in the US and India.  Sujoy earned a BS in Engineering from the Indian Institute of Technology (IIT) Delhi, an MBA from XLRI (India), and an MS in Software Engineering from Carnegie Mellon University.


April 10, 2007

The Software Project as a Journey

There have been many comparisons between software projects and other kinds of efforts, such as building a house or a bridge, or some other engineering or architectural feat.  Another useful analogy is viewing a software project as a kind of journey.  You start out.  Things happen along the way.  You arrive at a destination.  So what makes any journey a success?  There are many possible criteria.  One famous example was "to get to the moon and return safely by the end of the decade."  For software projects, Brian Lawrence suggests that a worthy criterion might be "to fulfill the objectives of the sponsor."  Another could be "to make tons of money."  In this presentation, Brian will examine several journeys -- some where people traveled from place to place -- and some software journeys, which started with an idea and arrived at a destination.  All these journeys, physical and software, either succeeded or failed.  Why is it that some journeys succeed while others fail?  What are the critical success factors?  Brian will assert that, for both software and physical journeys, some of the factors are exactly the same.
Speaker:  Brian Lawrence (Principal, Coyote Valley Software)

As a consultant, Brian Lawrence (brian@coyotevalley.com, www.coyotevalley.com) teaches and facilitates requirements modeling and management, peer reviews, project planning, risk management, life cycles, and design specification techniques.  Brian served as a program chair for the SEPG'97 Conference and the 1998 International Conference on Requirements Engineering.  Brian also served on the editorial board of IEEE Software and as the editor of Software Testing and Quality Engineering magazine for 2000.  In addition, Brian is an instructor in the UC Santa Cruz Extension program in Software Engineering.


March 13, 2007

Requirements Management, an Integral part of Quality Release

Ambiguous, incomplete, and changing requirements are responsible for many software project failures.  Therefore, requirements management becomes a key component in project success and software quality.  Requirement clarity, definition (of attributes and constraints), storage, and change management are elements of requirements management that contribute to a quality product and realizing improved customer satisfaction.  Managing your requirements successfully means having complete visibility and accountability so your organization understands where and what the requirements are across the software life cycle.  Anita Wotiz will present an overview and key topics of requirements management, its role in the software development life cycle, and will describe how each element contributes to software quality and helps to meet project delivery dates.
Speaker:  Anita Wotiz (Program Coordinator, Software Engineering, UCSC Extension)

Anita Wotiz (wotizconsulting@yahoo.com) has been a software engineer, system architect, and senior-level manager at several Silicon Valley companies, including Ford Aerospace / Loral / Lockheed and Lucent Technologies, and was most recently Vice President of Engineering at Azerity, a small enterprise application software company.  Her past technical work includes the development of real-time, embedded systems.  Her breadth of experience allows her to understand which core practices work and how to tailor best practices to fit an organization's needs.  Anita is the Program Coordinator for both the Software Engineering & Management and Software Engineering programs at UCSC Extension.  She is an instructor for the courses Software Requirements Engineering and Tracking Software Quality & Project Progress.  She holds a B.S. in Mathematics and an M.S. in Computer Science.


February 13, 2007

How to Design Regression Test Automation Frameworks for System Testing   (presentation slides -- PDF)

Sachin Bansal of Adobe will discuss designing modular, customizable and user-friendly regression test automation frameworks for the system testing of servers.  He will explain how to identify, design, implement, and execute complex automation frameworks involving different technologies.  Different frameworks will be presented to elaborate on the challenges and lessons learned.  Sachin will also discuss the challenges faced during integration of the bug tracking system, test case management system, performance testing system and automation system.
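One plausible reading of the modularity theme, in a sketch we invented for illustration: hide each external system (bug tracker, test case manager) behind a small interface, so an integration can be swapped without touching test code:

    public class FrameworkSketch {

        interface BugTracker { void fileBug(String summary); }
        interface TestCaseStore { void recordResult(String testId, boolean passed); }

        // Stand-in adapters; real ones would call the actual systems' APIs.
        static class ConsoleBugTracker implements BugTracker {
            public void fileBug(String summary) { System.out.println("BUG: " + summary); }
        }
        static class ConsoleStore implements TestCaseStore {
            public void recordResult(String id, boolean passed) {
                System.out.println(id + " -> " + (passed ? "PASS" : "FAIL"));
            }
        }

        public static void main(String[] args) {
            BugTracker bugs = new ConsoleBugTracker();  // swap in a real adapter later
            TestCaseStore store = new ConsoleStore();

            boolean passed = false;                     // pretend a test just ran
            store.recordResult("TC-101", passed);
            if (!passed) bugs.fileBug("TC-101 failed on build 42");
        }
    }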
Speaker:  Sachin Bansal (Senior Quality Engineering Manager, Adobe Systems)

Sachin Bansal (sbansal@adobe.com) is currently Senior Quality Engineering Manager at Adobe Systems Inc.  He has extensive experience designing and developing custom, user-friendly server-side testing frameworks.  His research interest is mostly in client-server and database technologies.  Before joining Adobe, he held engineering and management positions at i2 Technologies and BlackPearl Inc.  Sachin graduated from the Indian Institute of Technology (IIT) Kanpur and has a Master's from the University of Wisconsin-Madison.


January 9, 2007

Keeping Score - How to Know When You're Done   (presentation slides -- PDF)

Knowing when a product is ready to ship is one of the hardest questions companies have to answer, probably second only to its corollary, "How long will it take to build?"  A simple answer, which some might think facetious, is "When it passes its tests."  This talk presents a business case for doing just that -- scoring product completion via the state of its testing.  Unlike the common wisdom of looking at error rates or bug report frequencies, this approach predicts the total number of tests required to exercise the product completely and keeps score of the product's readiness through a few simple measures.  It is implemented via a novel approach to regression testing.  As a side effect, it gives the quality team's efforts significant visibility in the product development process.
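A minimal sketch of the scoring idea with invented numbers (the talk's method for predicting the total is its own):

    public class ReadinessScore {
        public static void main(String[] args) {
            int predictedTotal = 1200;  // estimated tests needed to exercise the product
            int written = 950;
            int passing = 870;

            System.out.printf("Authored:  %.0f%% of predicted tests%n",
                    100.0 * written / predictedTotal);
            System.out.printf("Readiness: %.0f%% of predicted tests passing%n",
                    100.0 * passing / predictedTotal);
        }
    }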
Speaker:  David Roland (Senior Computer Scientist, Computer Sciences Corp., NASA Ames Research Center)

David Roland is a Senior Computer Scientist with Computer Sciences Corporation at NASA Ames Research Center.  David has over thirty years experience in the aerospace and software industries.  An aerospace engineer by training, he spent eight years at Douglas Aircraft before moving to NASA Ames Research Center, where he spent six years developing graphical data preparation and analysis software using advanced surface modeling and visualization techniques.  There he was an early adopter of Fagan Inspections.  He moved into the commercial world as a developer and quality practitioner, including as Software Quality Manager, in industries ranging from FDA-regulated life sciences companies to Silicon Valley start-ups.  He returned to NASA Ames four years ago, working on DBMS for data mining and ground systems for Mars exploration.  His presentation is based on ideas he acquired while working with a Fujitsu quality manager, Hajime Sugiura, at a now-defunct startup.


December 12, 2006

Book Signing and Holiday Festivities with Local Authors

This month, we will celebrate the efforts and accomplishments of community members who have contributed to the body of knowledge in software quality and testing.  Four authors will be present, and each will take about 10 minutes to present the primary theme of his/her book.  Afterwards, we'll enjoy holiday food and beverages, and the authors will be happy to sign books.  It's best to buy your copies beforehand; all books are available for purchase from http://www.amazon.com.
Speakers:  Tim Stein, Jason Reid, Alka Jarvis, James Cunningham

- R. Timothy Stein.  He founded Business Performance Associates in 1994; the firm has consulted with more than 80 companies from 18 industry segments, particularly the medical device, pharmaceutical, biologics, and diagnostic industries.  Stein wrote "The Computer System Risk Management and Validation Life Cycle".  This book is the first technical manual to integrate computer system validation, risk management, and system implementation into a single, easy-to-use process.  Easily understood by system users and IT professionals, the book explains basic concepts and translates them into how-to deliverables, simplifying the tough decisions associated with a wide range of systems and their potential risks of failure.

- Jason M. Reid.  He is a test engineer at Sun Microsystems working in the Solaris System Test group.  He has also been an SQA engineer in the Developer Tools group.  Reid wrote "Secure Shell in the Enterprise".  The book covers the technical aspects of secure shell and, as its title states, the methodology of using secure shell in a large environment.  It describes what secure shell does, covers the logistics of deploying SSH, and addresses directly related topics such as authentication and public and private keys, giving the reader an understanding of how SSH can be used in an enterprise setting.

- Alka Jarvis.  She is Manager of Software Quality at Cisco Systems and a certified quality lead auditor (ISO 9000).  Jarvis wrote "Inroads to Software Quality" and "Dare to Be Excellent".  The first book, "Inroads to Software Quality", is used as a textbook in the MBA program of Santa Clara University, UC Berkeley-Extension, and UCSC-Extension.  The second book, "Dare to Be Excellent", describes the successful software practices of ten companies, including Intel, Texas Instruments, Cisco Systems, and others.

- James A. Cunningham.  He has a 25-year history in the corporate semiconductor industry, working for such powerhouse companies as TI, National Semiconductor, and AMD, often at the vice president level.  He holds 46 patents and has published 18 technical papers, a 200-page book on CMOS technology, and a book published in Japan in the 1980s concerning the growing strength of the Japanese semiconductor industry.  Cunningham has written "The Hollowing of America".  This book analyzes the economic ramifications of America's growing loss of domestic manufacturing and the associated massive trade imbalance, and tells how to survive, if not prosper, in the face of a massive decline in the dollar.


November 14, 2006

Test Automation Beyond Regression

Most testers picture GUI-based scripted regression testing when they think of test automation.  This is a very limited view of the potentially vast possibilities open to us when automating tests.  When we think of test automation, we should first think about doing things that we can't do manually.  This talk is about those limitations and how other kinds of test automation may be much more valuable.
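One example, invented in the spirit of the talk, of automation doing something no manual tester could: run a million random cases against a trivially correct oracle.  Here a "fast" integer-average function harbors a subtle bug (the unsigned shift mishandles negative sums) that random comparison finds almost immediately:

    import java.util.Random;

    public class RandomComparisonTest {
        static int fastAvg(int a, int b) { return (a + b) >>> 1; }  // wrong for negative sums
        static int oracleAvg(int a, int b) { return (int) (((long) a + b) / 2); }

        public static void main(String[] args) {
            Random rng = new Random(1);  // fixed seed so failures reproduce
            for (long i = 0; i < 1_000_000; i++) {
                int a = rng.nextInt(), b = rng.nextInt();
                if (fastAvg(a, b) != oracleAvg(a, b)) {
                    System.out.println("mismatch at a=" + a + ", b=" + b);
                    return;
                }
            }
            System.out.println("1,000,000 random cases agreed");
        }
    }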
Speaker:  Doug Hoffman (Program Manager, Hewlett-Packard)

Doug Hoffman has thirty years experience as a consultant, manager, and engineer in the computer and software industries, including twenty years spent evaluating, creating, and turning around software quality departments in medium-sized and small start-up companies.  He has experience with corporate, quality assurance, development, manufacturing, and support functions and procedures.  Doug has been a management consultant, course developer, instructor, author, and project manager for Software Quality Methods, LLC (www.SoftwareQualityMethods.com), an adjunct instructor for the University of San Francisco, and an instructor for UCSC-Extension ("Fundamentals of Software Test Automation").  He holds ASQ-CSQE and ASQ-CQMgr certifications, is an ISO 9000 auditor, and has spoken at numerous conferences.  Doug is now working as a Program Manager at Hewlett-Packard and can be reached at doug.hoffman@acm.org.


October 10, 2006

The 5 Percent Rules of Test Automation

Successful automation is every test manager's dream.  It can shorten time-to-market and improve quality assurance.  Testers would love it too, since it can alleviate much of the boring work of executing tests and free up time to design better tests and to better follow up on test outcomes.  In practice, however, success is elusive: most test automation tools either end up on the shelf collecting dust or are, at best, used for a small part of the testing.  When is test automation successful?  In this talk, Hans will argue that stable automation is only achieved if a large percentage of test cases can be executed automatically and the automation does not take too much time away from the testers.  To provoke the issue, he will challenge us with the following two rules:  (1) no more than 5% of all tests should be executed manually, and (2) no more than 5% of all efforts around testing should involve automating the tests.  Apart from presenting the rules and why he feels they are important, Hans will mostly talk about how to meet them, using actual projects to illustrate a number of key principles that drive automation success: (1) test design, (2) automation architecture, and (3) organization.  He will introduce his Action Based Testing framework as an example of a methodology with which the 5% standards can be achieved.
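For concreteness, a toy check of the two rules against hypothetical project numbers:

    public class FivePercentRules {
        public static void main(String[] args) {
            int totalTests = 4000, manualTests = 150;
            double totalTestEffortHours = 2000, automationEffortHours = 90;

            double manualShare = 100.0 * manualTests / totalTests;
            double autoShare = 100.0 * automationEffortHours / totalTestEffortHours;

            System.out.printf("Manual tests:      %.1f%% (rule: at most 5%%)%n", manualShare);
            System.out.printf("Automation effort: %.1f%% (rule: at most 5%%)%n", autoShare);
        }
    }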
Speaker:  Hans Buwalda (Chief Technology Officer, LogiGear Corporation)

Hans Buwalda is an internationally recognized expert in test development, test automation, and test management.  He was the first to present test automation with keywords, which he further developed into what is now Action Based Testing.  Hans also developed the concept of Soap Opera Testing, which he wrote about in the February 2004 issue of Better Software Magazine.  Hans is a well-known speaker at international conferences and a co-author of "Integrated Test Design and Automation", published by Addison-Wesley in 2001.  In September, two of Hans' articles were published: "Building Your Dream Team" in Better Software Magazine and "The 5 % Challenges Of Test Automation" in STP Magazine.


September 12, 2006

Who is Responsible for Quality?

Since the internet boom, small and large companies alike have been driving their teams to work faster.  Since the tech bust, small and large companies have been asking their teams to do their work with fewer people.  The squeeze is on, teams miss their goals, software is shipped before its time and, in the end, the customers suffer.  Whose job is it to make sure we produce quality software?  How do we build quality into the planning and development process?  How do we avoid creating adversarial situations between those trying to meet company goals and those trying to ensure quality?  Come and hear Mark talk about his experiences and answers to these questions!
Speaker:  Mark Himelstein (President, Heavenstone Inc.)

Mark Himelstein has been managing software organizations since 1984.  He has led both small and large organizations in startups and Fortune 500 companies.  Mark's experience running the worldwide Solaris engineering group at Sun Microsystems Inc. helped solidify some of the concepts found in his book, "100 Questions to Ask Your Software Organization".  He also draws from experience running his own companies.  Mark is currently President and CEO of Heavenstone Inc. (www.heavenstoneinc.com), a software development and management consulting firm.  He earned a Bachelor of Science in both Computer Science and Mathematics and a Master's in Computing Science.  He holds four patents and has published a number of technical papers.


August 8, 2006

Quality-Driven Build Scripts for Java Applications   (presentation slides -- PDF)

Agile build scripts not only compile code but also provide important information about the product. These documents and metrics vary in their objectives, but they facilitate the attainment of a common goal: building quality software. Unlike physical goods, obtaining information about applications is relatively inexpensive and disproportionately beneficial to engineering efforts. We'll explore open-source, off-the-shelf utilities for creating quality-driven build scripts for Java applications.
Speaker:  Duy Bao Vo (Graduate Student, San Jose State University)

Duy Bao Vo is a software engineer at CyberSource Corp. in Mountain View and a graduate student in San Jose State's Computer Science program. He has a BS in Applied Mathematics, Computer Science, and Physics from San Jose State and is currently working on his thesis on persistence design patterns for use with EJB 3.0 and POJOs. In his spare time, Duy works with Aid to Children Without Parents, the AFL-CIO South Bay Labor Council, and World of Good.


July 11, 2006

Metrics, Benchmarking and Predictive Modeling

How does a company measure and deliver on customer success? In this interactive session, David will share how Cisco links Customer Lifetime Value, Customer Loyalty, Customer Satisfaction, Product / Service Satisfaction, Product / Service Quality Experience, and Product / Service Design. He will elicit feedback on how to set customer experience targets based on customers' explicit and derived needs, competitive pressures, industry best practices, and company process capability. We will discuss the role of, and approaches to, benchmarking in helping set these targets, as well as the precision and key elements of the predictive modeling needed to assure that the company delivers on these measurable goals. Please come to this engaging session to learn, share, and discuss these key elements of measuring and delivering on customer success.
Speaker:  David Hsiao (Director, Metrics Strategy and Benchmarking COE, Cisco Systems)

The Cisco IOS® operating system software runs on nearly all of Cisco's routers, switches, and signal aggregation devices. David Hsiao led the Software Success Engineering team that defined, measured, analyzed, improved, and controlled quality initiatives to improve customer satisfaction with Cisco IOS® software. David has recently assumed new responsibilities with Cisco's Corporate Quality department, heading up its metrics strategy and benchmarking center of excellence. Before joining Cisco, David served as Assistant VP of Quality at SAIC / Telcordia and Director of Process Leadership at AT&T (GRC International); he improved quality management systems at Telcordia, Global One, US West, Network Solutions, and State Farm, and designed and delivered information systems for Boeing, Ford, and the US government.


June 13, 2006

Fighting the BUG WAR with Inspections and Reviews: A Success Story

Understanding the quality of your systems is best aided by proper defect classification and analysis, so that the right practices are followed, the right policies are chosen, and the tools are used in the right way.  Through careful logging and analysis of inspection and review results, learn how Cadence applies the lessons learned to reduce its engineering costs.
Speaker:  Duvan Luong, PhD (Technical Director for Enterprise Quality, Cadence Design Systems)

Duvan Luong is currently Technical Director for Enterprise Quality at Cadence Design Systems (www.cadence.com) in San Jose.  Duvan has many years of experience working in Software Quality Engineering for leading companies including AT&T Bell Labs, IBM, Synopsys, Sun Microsystems, and Hewlett-Packard.  Duvan's passion is in the areas of Quality Engineering, Software Testing, and organizational transformation for sustaining operational excellence.  Duvan has a PhD in Computer Science, with an emphasis on Software Testing, from Lehigh University.


May 9, 2006

Using Data-Driven Analysis to Increase Customer Satisfaction   (presentation slides -- PDF)

Learn how the Cisco Systems Technical Services Group uses a combination of statistical tools and integrated data analysis to identify key drivers of customer satisfaction and loyalty.
Topics that will be covered are:

    * Apply various statistical methods, such as:
          o Qualitative data integration using Linkage Analysis
          o Multiple regression analysis modeling
          o Other processes borrowed from the Six Sigma toolbox
    * Present data that gets attention
    * Translate feedback into improvement initiatives
    * Monitor progress and measure success
    * Apply what we learned
Speakers: Lisa K. Arnold (Customer Satisfaction Analyst, Cisco Systems)
                  Anu Ranganath (Voice-of-the-Customer Program Manager, Cisco Systems)


Lisa Arnold has a BS in Psychology and an MS in Quality Systems Management.  For five years, Lisa has worked at Cisco Systems as a Quality Systems Engineer focusing specifically on Customer Satisfaction with Cisco's products.  Prior to this, she worked as a Training Program Manager, and spent 10 years as a Technical Support Specialist.  Lisa moved to California from Massachusetts in 2005 to concentrate on driving adoption of the customer satisfaction program further into the Cisco organization.

Anu Ranganath works at Cisco Systems within the Global Technical Services organization.  Her expertise is in understanding Customer Experience, making data actionable across the Services organization.  Prior to this, she worked as a Customer Insight and Change Management Manager at Sun Microsystems within Customer Advocacy and IT.  Her specialty is linking her deep understanding of customer experience with organizational priorities.  A passionate "customer advocate", she enjoys researching best practices to create customer-focused organizational change.  She is the author of several research papers, previously published at Sun Library.


April 11, 2006

Software Testing as a Career – Still Viable?

Much has been made of how offshoring has robbed Silicon Valley of its high-tech jobs, with new opportunities springing up in such far-flung places as India, China, and Eastern Europe.  But what's the real impact today?  While the SSQA membership has seen profound improvements in employment over the past year, come hear the thoughts of Mikhail Portnov of Portnov Computer School in Mountain View, CA.  He experienced first-hand the explosive growth of the 90's, when job needs went from a handful of openings to hundreds of open positions and then back down in the early 2000's.  Yet Mikhail remains positive about the outlook for software QA and test professionals.  As the job market revives, his students (many of whom are well-educated and legal immigrants) and their spouses are finding QA and testing positions more quickly again.  Mikhail will cover the following topics:

    * Different venues for QA training today
    * Different niches in the QA training market
    * Specific ideas for short-term career transition for degreed students
    * Realities of today's job market
    * Thoughts on local QA resources versus offshore testing
Speaker: Mikhail Portnov (Founder, Portnov Computer School)

Mikhail Portnov founded Portnov Computer School (www.portnov.com) in 1994; the school specializes in teaching software testing skills to qualified students within 2 to 3 months and helping graduates find jobs in QA.  Mikhail has degrees in electrical engineering (telecommunications) and math, with post-graduate studies in the psychology of professional education.  He spent four years as a digital design engineer and six years in educational research in Russia.  He has worked for Borland, Lotus Development, BroadVision, Portfolio Technologies, and TestDrive Corp.  He has taught math and logic design at the college level since 1980.  Contact him at mikhail@portnov.com.


March 14, 2006

Software Engineering: Facts or Fancy?   (presentation slides -- PDF)

We hear many things that we either are or should be doing to deliver the right software in a timely and cost-effective manner.  Some of these approaches are very popular, some are less so.  But do we really know which of these approaches actually work?   And how would we know if they did?  In this presentation, Brian will offer some ideas and we'll examine some of the assumptions we make about software engineering.   Which things that you currently believe in are indeed fact?  And which are fancy?   Let's learn together.
Speaker: Brian Lawrence (Principal, Coyote Valley Software)

Brian Lawrence is principal of Coyote Valley Software (www.coyotevalley.com), a software consulting firm.  Brian teaches and facilitates requirements modeling and management, inspection, project planning, risk management, life cycles, and design specification techniques.  Coyote Valley Software has worked with software companies spanning the domain of the software industry, from out-of-the-gates software startups to package software publishers to defense contractors to biomedical and biotech companies.  In 2000, Brian served as the editor of Software Testing and Quality Engineering Magazine.  In addition to teaching in the University of California Santa Cruz Extension program in software engineering, he has also served as a program chair for the 1997 SEPG Conference as well as the 1998 International Conference on Requirements Engineering.  Brian served on the editorial boards for Better Software (STQE) and IEEE Software.  Contact him at brian@coyotevalley.com.


February 14, 2006

QA Road Warriors

QA has matured, and yet in many parts of the world there is a deep need for qualified QA professionals, individuals who know how it is done right.  While Silicon Valley continues to shake itself awake after its long slumber, these two professionals couldn't wait and sought opportunities and professional excitement outside the valley.  Doug Hoffman joined SDT and brought his assessment and training expertise to companies in India, China, Canada, and France.  Claudia Dencker brought her test leadership and management skills to an electrical utility company in Edmonton, Canada.  Share in their stories of excitement, trials and tribulations of what it means to take QA expertise on the road.
Speakers: Claudia Dencker (President, Software SETT Corporation)
                  Doug Hoffman (President & Principal Consultant, Software Quality Methods, LLC)


Claudia Dencker is President of Software SETT Corporation (www.softsett.com), a company specializing in the global delivery of software testing and QA solutions.  She has taught classes worldwide through the IEEE and Software SETT to major Silicon Valley companies.  She also participates on software testing projects in an advisory or hands-on role.

Doug Hoffman is President and Principal Consultant of Software Quality Methods, LLC (www.softwarequalitymethods.com) and is an ASQ Fellow, ASQ-CSQE, ASQ-CQMgr.  He has 30 years experience in the software engineering and quality assurance fields and now is a management consultant in strategic and tactical planning for software quality.


January 10, 2006

The T in Quality

This talk will focus on the T in Quality.  We'll start with the definition of QUALITY and contrast it to Quality Assurance and Quality Control.  We'll also talk about the institutionalization of quality and what types of support systems need to be in place to make this happen.  Finally, we'll review some case studies in implementing training to support Software Process Training.
Speaker: Lew Jamison (CEO / Learning Strategist, Performance Improvement Circle)

Lew Jamison is CEO and Learning Strategist at Performance Improvement Circle (www.picircle.com) and has been working in the software industry for over 20 years.  He's worked with companies such as Sun, Cisco, HP, eGain, Portal, PeopleSoft, CalTran and many more.  His company provides independent advice on establishing training infrastructures which sustain organizational learning.  Most recently, he's been involved with helping companies establish and sustain Organizational Training as part of the CMMI 1.1 (Capability Maturity Model Integration) of the Software Engineering Institute (SEI).


December 13, 2005

Quality Training: What's been your experience?

The speaker will facilitate a discussion on the audience's experiences with quality-related training in the corporate environment.  How useful has it been?  How does one measure its effectiveness?
Speaker: Lew Jamison (CEO / Learning Strategist, Performance Improvement Circle)

Lew Jamison is CEO and Learning Strategist at Performance Improvement Circle (www.picircle.com) and has been working in the software industry for over 20 years.  He's worked with companies such as Sun, Cisco, HP, eGain, Portal, PeopleSoft, CalTran and many more.  His company provides independent advice on establishing training infrastructures which sustain organizational learning.  Most recently, he's been involved with helping companies establish and sustain Organizational Training as part of the CMMI 1.1 (Capability Maturity Model Integration) of the Software Engineering Institute (SEI).


November 8, 2005

Estimating Software Size

Accurately projecting the size of a proposed software system remains the weakest link in the software cost estimating chain.  Deriving an appropriate size estimate is neither straightforward nor trivial.  Due to the lack of definitive information during the concept and design phases of software system development, size estimates made in those phases are characterized by uncertainty, generally resulting in estimates of very low credibility or validity.  Even as systems mature in their final stages (with requirements stabilized, all data inputs, outputs, and interfaces identified, and all processing functions clearly defined), the process of sizing software is still subject to a wide margin of uncertainty.  This presentation addresses the software sizing problem and discusses the Software Sizing Model (SSM) developed by Dr. Bozoki and in use worldwide. Also discussed will be how this model can help organizations address CMMI® model requirements regarding estimation and historical data.
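The SSM itself is proprietary, so as a generic stand-in for reasoning about size uncertainty, here is a simple three-point (PERT-style) estimate with invented numbers:

    public class ThreePointSize {
        public static void main(String[] args) {
            double low = 20_000, likely = 35_000, high = 80_000;  // lines of code

            double expected = (low + 4 * likely + high) / 6;  // PERT mean
            double stdDev = (high - low) / 6;                 // rough spread

            System.out.printf("Expected size: %.0f LOC (std dev ~%.0f)%n", expected, stdDev);
        }
    }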
Speaker: Dr. George Bozoki (Founder, Target Software)

Dr. Bozoki's professional pursuits over the past two decades have concentrated on Software Engineering.  During this period, he has earned a reputation throughout the software engineering community as an authority on software engineering metrics and models pertaining to size, effort, schedule, quality, and productivity.  His work experience spans thirty-five years -- twenty years of software engineering in the aerospace industry, eight years in academia as a university professor, and seven years in industry in the area of operations research and engineering.  In 1980 he founded Target Software, a consulting firm specializing in Software Engineering.  In addition to consulting, Dr. Bozoki lectures and leads seminars dealing with software sizing and costing in the US, Europe, Australia, and New Zealand.  Dr. Bozoki received his BSME from the Technical University of Budapest in 1956, his MSIE from Purdue University in 1965, and his PhD from Purdue in 1969.  His consulting firm's website is http://www.targetsoftware.com.


October 11, 2005

Logical Entity/Relationship Modeling: The Definition of Truth for Data

Logical Entity/Relationship (E/R) models, also referred to as "conceptual" or "semantic" models, define the information requirements of the enterprise, independent of the resulting implementation.  A well-defined E/R model is the key to successful development of data-oriented applications.  Although most frequently associated with relational databases, the logical E/R model is equally applicable to object-oriented and XML implementations.  This presentation will provide an overview of the fundamentals of E/R modeling as the definition of the information requirements of the enterprise.  It will focus on the underlying concepts and notations, with a strong emphasis on the semantic content of the E/R model.
Speaker: Jeff Jacobs (Covad Communications, Jeffrey Jacobs and Associates)

Jeff Jacobs has over 20 years experience in software development, with a focus on software development methodologies and practices.  His management experience ranges from leading development for startups to overseeing multi-company efforts for communications satellite systems.  He has consulted to numerous companies and has trained over 3000 students in Information Engineering and various modeling techniques.  Jeff is the author of numerous papers and is a frequent presenter at technical conferences.  He received his B.S. in Information and Computer Sciences from UC Irvine, where he was one of the co-authors of UCI LISP.  Jeff is currently a Data Architect with Covad Communications, as well as a consultant providing services in software process improvement, methodology adoption and tailoring, and business/systems analysis and modeling.  His website is http://www.jeffreyjacobs.com.


September 13, 2005

Automation Techniques for Enterprise Application Testing

Enterprise applications comprise dozens of technologies and hundreds of classes, often developed and tested by scores of dispersed teams using disparate build and test frameworks. Integrating application components developed by different teams often results in test bases with numerous build frameworks that are inefficient and a nightmare to enhance. This session presents proven testing strategies, with lessons drawn from the Sun Java System Application Server SQE Team.
Speaker: Aditya Dada, Sun Microsystems, Sun Java System Application Server SQE Team

Aditya Dada has been with Sun's Application Server Quality Engineering group for over 4 years. He is the lead architect for the automation framework currently used to test the Application Server and is an expert in the deployment of enterprise applications. He is a Sun Certified Java Developer and a Sun Certified Business Component Developer.


August 9, 2005

A Process Driven Approach for Effective Application Service Quality for IT Organizations

A characteristic of most IT organizations is that controls and procedures for new software development projects are very rigorous, while Operations and Maintenance controls and procedures are much less structured and accountable.  However, applications are constantly changing, driven by ever-changing business requirements and problem reports from end users.  A lack of solid process can drive maintenance costs sky high and create a very chaotic environment.  Keith Mangold and Q Analysts have developed a model leveraging best-of-breed industry standards like ITIL, CMMI, TQM, and CobiT to incrementally improve application service quality for large IT enterprises.  The model prescribes standardized procedures as the vehicle for focused process improvements that increase efficiency and reduce cost.  This presentation focuses on software change, but is applicable to business process, data content, and infrastructure changes.
Speaker: Keith Mangold,  Q Analysts

Keith Mangold has 20+ years of experience as a consultant focused on software engineering, project management and quality assurance procedures.  Keith has recently been involved in governance, change control, release management and risk management leading to the models to be presented at the SSQA meeting.


July 12, 2005

Early Testing Without the 'Test and Test Again' Syndrome

Developers and testers sometimes get into a frustrating dance in which the developers provide code for test, the testers run tests and document findings, the developers fix the problems and re-release for testing, the testers rerun and document new, different problems, and so on. For good reasons, teams often begin "formal" testing on new software while it is still being coded. In this case the testers are working full tilt: running tests, investigating and isolating faults, writing up defects, rerunning the tests, and verifying fixes; but a lot of time is wasted by everyone on problems the developers already know about. As a manager, developer, or tester, you can break out of this vicious cycle and get to a better place.

Speaker: Doug Hoffman, SDT Corporation


June 14, 2005

Sabotaging QA: a Primer

In the trenches, software development can often resemble a low-intensity conflict between the developers and QA. This presentation goes over various tactics development can use to demoralize, starve, divert, entrap, or stall the QA team. Strategies are shown for countering and ultimately resolving the conflict, uniting the two warring sides in releasing a quality product.

Speaker: Jason Reid, Sun Microsystems

Jason Reid is a test engineer at Sun Microsystems working in the Solaris System Test group. He has also been an SQA engineer in the Developer Tools group. Prior to joining Sun, Jason worked at the Purdue University Computing Center as a UNIX system administrator while obtaining his BS in Computer Science.


May 10, 2005

The State of Spyware

Spyware has come of age as computer intrusions, infections, and hack attempts have increased dramatically over the past few years.  Users, ranging from the novice to the IT and QA professional, have many options open to them to better protect their systems in a game where there are no winners and no end.  Robert Konigsberg will present information on spyware: where it is today, the various forms it can take, and related topics.

Speaker: Robert Konigsberg, Founder of Network Evaluation
Robert is the founder of Network Evaluation and has been involved in various aspects of computer security since 1992.  He has produced magazine articles, white papers, tutorials, guides, and presentations aimed at informing and educating users on various aspects of networking and network and computer security.  Before starting Network Evaluation, he worked for companies such as 3Com, Computer Curriculum Corporation, and Pearson Education, as well as following the Silicon Valley tradition of trying his hand at a few startups.  He has earned SANS GSEC certification and is an active member of the Center for Internet Security.


April 12, 2005

Solaris and Open Source - Current Status

Sun has announced plans to release the source code for the Solaris Operating System under an open source license, and to open the development process to external developers. This talk described these plans, the current status, and some of the challenges involved in moving a large commercial software project to an open development process.
Speakers: Andy Tucker and Keith Wesolowski

Andy Tucker is a Distinguished Engineer in the Operating Platforms Group in Sun Microsystems. He has been at Sun since 1994 working on a variety of projects related to the Solaris operating system, including scheduling, multiprocessor support, inter-process communication, clustering, resource management, and server virtualization. Most recently, he was the architect and technical lead for Solaris Containers, and is helping lead the effort to make Solaris available as open source. Andy received a Ph.D. in Computer Science from Stanford University in 1994.

Keith Wesolowski is an engineer in the OpenSolaris team within Sun Microsystems' Operating Platforms Group. He joined Sun and the OpenSolaris project in 2004 and has an extensive open source background, including SPARC and MIPS Linux ports and several smaller projects.


March 8, 2005

QA and Open Source - The Good, the Bad and the Ugly 

Sleepycat Software makes the Berkeley DB family of products and makes them available under a dual license model, meaning we have both open source and proprietary licenses. Our technical team has extensive open source project experience, and the Berkeley DB product has its roots at the University of California, Berkeley. This presentation explored the Engineering and SQA challenges of managing an open source product in a distributed company.
Speaker: Dave Segleau

David Segleau, who has more than 22 years of IT industry experience, oversees Sleepycat Software's engineering, quality assurance, and support operations. Segleau joined Sleepycat from Visto, where he was director of Quality Assurance and led the QA partnership with Handspring for delivery of the original TreoMail product. Previously, Segleau headed up customer service for Asta Networks in Seattle, was senior director of Engineering Services at Versata, and was senior director for quality assurance and technical support at Informix and Illustra.


February 8, 2005

War Stories from the Ground Level

This February meeting was a working meeting for the SSQA membership. We started out with our annual topic, War Stories from the Ground Level, and wrapped up with a brainstorming session on topics for 2005.
Speaker: Roundtable Discussion (SSQA Membership)


January 11, 2005

Tips for Managing an Offshore Team from Three Who Know

As more software IT work goes offshore, some local IT professionals are evolving their jobs to address the critical need for global project managers. In January we heard from three project leads/managers who are actively managing global [offshore] teams. They shared some of their success factors, challenges and concerns as they work within the new business model directed by executive management.

Speakers: Yana Mezher, Dave Weir, Dave Liebreich

Yana Mezher - test lead, Software SETT Corporation. Over the past eight years, Ms. Mezher has focused on developing online QA courseware, supporting testing projects with mentoring, team training and hands-on project management. She is currently managing a team based in Bangalore, India.

Dave Weir - consultant with Calavista managing several offshore QA projects for start-ups in Silicon Valley. Over the past 18 years Mr. Weir has worked with companies such as Keynote Systems, XUMA, KPMG/Triton Container International, Pacific Bell and Tandem Computers. He is currently managing an offshore QA partner based in Pune, India.

Dave Liebreich - QA Manager at Yahoo. Over the past 20 years, Mr. Liebreich has been involved in test management, test engineering and system administration working on a wide range of technologies and products. He is currently managing teams based in Sunnyvale and Bangalore, India.


December 14, 2004

Outsourcing in Software Engineering

In the field of software engineering, outsourcing various software projects requires developing, implementing, and managing methodologies that ensure the job gets done and produces results.  This talk presents a high-level overview of outsourcing and the countries that are the big winners in this new business model.  It will also discuss the overall impact and ramifications for the software and technical professions.

Speaker: Sean Nihalani

Sean Nihalani, DSc, is the Director of the Engineering and Technologies Department at UCSC Extension in Silicon Valley. Dr. Nihalani has designed, presented, and managed engineering, IT, and management courses at many universities and corporations. His 18 years of experience include the design, development, and troubleshooting of LANs, WANs, hardware, software, network security, and databases, as well as the management of various engineering, IT, and financial projects.


November 9, 2004

Part 11 - Electronic Records and Electronic Signatures: Review of the Regulation and a Discussion of Issues

This presentation provided an overview of the Part 11 regulation. Major areas of non-compliance routinely found in software not specifically developed for Part 11 compliance were discussed. Issues that organizations face in complying with the regulation were outlined. Several compliance strategies were presented.

Speaker: Tim Stein, PhD

Tim Stein founded Business Performance Associates (BPA), a Cupertino-based consulting firm, in 1994. Tim has helped over 100 clients achieve Part 11 compliance, validate systems, implement business applications, or develop compliant quality systems. Tim recently finished a manuscript for a book titled "Computer System Risk Management and Validation Lifecycle". The book will be published by Paton Press and is scheduled for release next year. He is a frequent speaker on the topics of Part 11 and software validation.


October 12, 2004

Topic: Why Performance QA is Broken and How to Fix It.

Abstract not available.

Speaker: Damien Farnham

Damien Farnham is a Senior Manager for the Sun Microsystems Solaris Performance team in Dublin, Ireland.


September 14, 2004

Topic: SNMP: A primer

This presentation provided a primer on SNMP.

Speaker: William Estrada

William R. Estrada II has over 25 years experience as a System Programmer (mainframe and PC), Network Admin, Lab Manager, System Programming Manager, and Computer Operator. His on-the-job experience spans a wide range of operating systems (MVS, VS1, OS/2, DOS, Windows, Linux, and FreeBSD). His specialties are problem solving, automation, scripting, networking and, last but certainly not least, SNMP.


August 10, 2004

Topic: Staying Relevant in a Competitive Market

This presentation covered strategies for staying relevant in a competitive market as a QA professional.

Speaker: Peter Yarbrough

Peter Yarbrough is a QA professional with over seven years of experience working on many large-scale IT projects for Fortune 500 companies. He has acted as QA lead or manager, as well as QA engineer, testing web applications and shrink-wrapped software products. Most recently he moved into technical support, working with a small team acting as a liaison between development and on-line support. As a QA professional who has had to adapt to a changing technical landscape, Peter brings a unique perspective on staying employed, engaged, and relevant in the QA profession. He studied Engineering at Santa Clara University and holds a degree in Technical Communications from De Anza College.


July 13, 2004

Topic: Market Based Job Searching

This presentation covered market-based approaches to the job search.

Speaker: Merrin Donley

Merrin Donley is a Career Management Specialist with over twelve years of experience working with diverse industries in Silicon Valley. She currently works for the Silicon Valley Workforce Investment Board at the Campbell One-Stop, assisting recently laid-off workers in their job searches.


June 8, 2004

Topic: Test Variables Impacting Wireless Applications

Here is a familiar test scenario. You are logging into an account. You enter the correct account number and password. The logon authentication fails. You try again, but this time you are careful - you are absolutely certain you have entered the correct account number and password. The logon fails again.

Did you check your headset? Headset! What does a headset have to do with a logon failure? In a wireless world, accessories can sometimes cause distortion or interfere with the sending of correct tones through your telephone or other wireless device. There are many other unique factors that can impact results when testing applications designed for a wireless environment.

This overview can help you enrich your test scripts and be prepared for the surprising results you sometimes get when testing wireless applications.
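
One way to enrich test scripts along these lines is to promote environmental variables -- accessory, signal strength, roaming state -- into explicit test parameters. A minimal sketch using pytest; the device harness is a hypothetical stand-in:

    import itertools
    import pytest

    ACCESSORIES = ["none", "wired headset", "bluetooth headset", "car kit"]
    NETWORKS = ["strong signal", "weak signal", "roaming"]

    class FakeDevice:
        """Hypothetical stand-in for a real wireless device harness."""
        def __init__(self, accessory, network):
            self.accessory, self.network = accessory, network
        def login(self, account, password):
            # A real harness would drive the handset; the fake always succeeds.
            return "authenticated"

    # Every accessory/network combination becomes its own test case, so a
    # headset-specific logon failure shows up as exactly one red cell.
    @pytest.mark.parametrize("accessory,network",
                             itertools.product(ACCESSORIES, NETWORKS))
    def test_login_succeeds(accessory, network):
        device = FakeDevice(accessory=accessory, network=network)
        assert device.login("12345678", "hunter2") == "authenticated"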

Speaker: Gail Lowell

Gail Lowell is the Product Manager and acting QA Manager for the Unified Communications solutions business unit of InPhonic, Inc., a leading provider of communication software and services. Gail has several years of experience managing the introduction and release of complex software applications in the US and internationally. She has industry experience in Unified Communications, Wireless Phones, Warehouse Logistics, Semiconductor Manufacturing, Human Resources, and Insurance software applications. Gail has been working with unified communications and wireless devices for the last four years.


May 11, 2004

Topic: Security Testing

In these uncertain times, security has taken on a new significance. Homeland security, white collar crime, corporate espionage, and new legal regulations drive the increased need for security. Customers, users, and the government expect, and may contractually oblige, you to deliver a secure application.

What is a secure application? What makes something "secure"? Will security testing keep my company's name unsullied? Will security testing turn my staff into a group of evil hackers? How can I plan for, execute, and validate security testing efforts?

This discussion provides an overview of what security is and how to test it, with examples of what to look for and why. Security is covered at both the conceptual and technical levels.
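
As one concrete example at the technical level, a common tactic is to feed known-hostile inputs to a validation routine and assert that every one is rejected. A minimal sketch; is_safe_username() is a hypothetical validator standing in for the system under test:

    import re
    import unittest

    def is_safe_username(value: str) -> bool:
        """Hypothetical whitelist validator: letters, digits, underscore only."""
        return bool(re.fullmatch(r"\w{1,32}", value, flags=re.ASCII))

    HOSTILE_INPUTS = [
        "admin'; DROP TABLE users; --",     # SQL injection
        "<script>alert(1)</script>",        # cross-site scripting
        "../../etc/passwd",                 # path traversal
        "A" * 10_000,                       # excessive-length abuse
    ]

    class SecurityInputTests(unittest.TestCase):
        def test_hostile_inputs_rejected(self):
            for payload in HOSTILE_INPUTS:
                self.assertFalse(is_safe_username(payload), payload)

    if __name__ == "__main__":
        unittest.main()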

Speakers: Rhonda Farrell and Jason Reid

Jason Reid is a test engineer at Sun Microsystems working in the Software System Test group. He has also been an SQA engineer in the Developer Tools group. Prior to joining Sun, Jason worked at the Purdue University Computing Center as a UNIX system administrator while earning his BS in Computer Science.


April 13, 2004

Topic: Realistically Estimating Test Projects

Question: When will the system testing be completed??? (Asked by an eager pest -- your boss -- with a tone of great anxiety.)

Note #1: At the time he asks this question, you do not know (a) the final scope of the functionality, (b) when the developers will deliver the final system for testing, and (c) what test resources you will have available.

Note #2: The boss wants a definitive answer and a drop-dead commitment from you in two minutes anyway.

Answer: Take a wild guess and multiply by two.

Question: What do you do when the boss cuts your agreed-on test duration by 85%??

Answer: Remind him that you thought he really understood that quality is important.

Let's face it: developing realistic and credible estimates is a critical survival skill for test professionals and managers. The word "estimate" is actually short for the saying "Establishing Sloppy Time Intervals Makes for Agitated Test Engineers". This discussion surveys techniques that can help improve your estimating; one such technique is sketched below.
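
One such technique -- a standard one, not necessarily the method presented in this talk -- is three-point (PERT) estimation, which tempers the wild guess by weighting a most-likely value against explicit best and worst cases:

    def pert_estimate(optimistic, most_likely, pessimistic):
        """Three-point (PERT) estimate: weighted mean plus a rough spread."""
        expected = (optimistic + 4 * most_likely + pessimistic) / 6
        std_dev = (pessimistic - optimistic) / 6
        return expected, std_dev

    # System-test duration in days: bounded guesses instead of one number.
    e, sd = pert_estimate(10, 15, 40)
    print(f"expect {e:.1f} days, +/- {sd:.1f}")   # expect 18.3 days, +/- 5.0

Reporting the spread alongside the expected value also gives the boss a commitment with honest error bars rather than a single drop-dead date.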

Speaker: Ross Collard

Ross Collard is president of Collard & Company, a consulting firm located in Manhattan, New York. His consulting assignments have included strategic planning for technology, managing large software development projects, improving software engineering practices, and software quality assurance.

Early in his career Ross was a hot-shot software engineer for Citibank in New York City. He first became interested in quality issues when he stayed up 48 hours straight trying to find a bug in his own code. During those same 48 hours, the operational failure caused by his bug cost Citibank approximately 1,000 times Ross's annual salary. Fortunately for Citibank, this loss amounted to only about 5 seconds' worth of the bank's profits. Also fortunately for Citibank and the rest of the worldwide business community, Ross does not program much anymore. But he sure does know how to test -- which is perhaps a more challenging skill than programming.

Ross Collard has conducted seminars on business and information technology topics for businesses, governments, and universities, including George Washington, Harvard, and New York Universities and U.C. Berkeley. He has lectured in the U.S.A., Europe, the Middle East, the Far East, South America, and the South Pacific. He holds a BE in Electrical Engineering from the University of Auckland, New Zealand (where he grew up) and an MS in Computer Science from the California Institute of Technology, and he attended Stanford University's Graduate School of Business. He is writing a series of books on software testing and QA, at a gruesomely slow pace.


March 9, 2004

Topic: War Stories at the Ground Level

Short descriptions of real events were shared by SSQA members at this meeting.

Speaker: Panel Discussion (SSQA Membership)

Bio not applicable

February 10, 2004

Topic: Roll Your Own .NET Automated Tests

Despite a variety of commercial graphical user interface (GUI) test tools on the market, programmers often find themselves resorting to manual testing of their GUIs. Adopting commercial GUI-based regression tools requires developers to learn a whole new development environment and language. Furthermore, these tools are often expensive and may be overkill for what developers need. Fortunately, there is an alternative for programmers who need to test a .NET GUI: reflection. By using reflection in the .NET framework, programmers can send events to user interface elements without a separate, specialized tool. In this talk, Elisabeth Hendrickson demonstrates how to use C# with nUnit to simulate user events to test .NET applications through the GUI.
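
No code from the talk is reproduced here, but the same idea can be sketched in Python with tkinter instead of C# with nUnit: use the language's reflection facilities (here, getattr) to reach widgets by name and fire their handlers from a plain unit test, with no capture/replay tool. The LoginFrame application is invented for illustration:

    import tkinter as tk
    import unittest

    class LoginFrame(tk.Frame):
        def __init__(self, master):
            super().__init__(master)
            self.result = tk.StringVar(value="")
            self.password = tk.Entry(self, show="*")
            self.submit = tk.Button(self, text="Log in", command=self._on_submit)
        def _on_submit(self):
            self.result.set("ok" if self.password.get() == "secret" else "denied")

    class LoginGuiTest(unittest.TestCase):
        def test_submit_with_good_password(self):
            root = tk.Tk()              # requires a display to run
            root.withdraw()             # but no window needs to appear
            frame = LoginFrame(root)
            # Reflection step: look the widgets up by name at runtime.
            getattr(frame, "password").insert(0, "secret")
            getattr(frame, "submit").invoke()    # simulate the button click
            self.assertEqual(frame.result.get(), "ok")
            root.destroy()

    if __name__ == "__main__":
        unittest.main()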

This talk is a preview of the talk Elisabeth will be giving at Software Development Conference and Expo West 2004 on March 18 in Santa Clara, CA.

Speaker: Elisabeth Hendrickson
Elisabeth Hendrickson is an independent consultant specializing in software quality, management, and testing. An award-winning author, Elisabeth has published numerous articles and is a frequent speaker at major software quality and software management conferences. She has worked with and for leading software companies since 1988. You can reach her at esh@qualitytree.com and read more about her ideas on quality and testing at www.qualitytree.com.

January 13, 2004

Topic: A Case Study in Best Practices in Software Process Documentation: Space Station Software Project Measurement and Analysis

The Software and Data Systems (S&DS) Team of the (International) Space Station Biological Research Project (SSBRP) has recently achieved a CMMI Maturity Level 2 rating. Key to this achievement was mastery of the Measurement and Analysis (MA) Process Area -- one that did not exist in the previous CMM. S&DS applied the Practical Software Measurement (PSM) approach to tackle the MA process area requirements. This presentation illustrates best practices in the area of Process Documentation through examples from S&DS's Measurement and Analysis process. Included are insights into Process and Plan template creation and use, tailoring, and compliance. And of course there's the measurement process itself -- which has garnered praise from the CMMI appraisers and from measurement experts at the SEI.
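
For a sense of scale, a PSM-style indicator can be as small as a ratio derived from two base measures. A generic illustration in Python -- not the S&DS team's actual measures, which are not reproduced here:

    def defect_density(defects_found: int, size_ksloc: float) -> float:
        """Derived indicator: defects per thousand source lines of code."""
        return defects_found / size_ksloc

    baseline = defect_density(42, 120.0)   # e.g., prior release
    current = defect_density(18, 95.0)     # e.g., this release
    print(f"baseline {baseline:.2f}, current {current:.2f} defects/KSLOC")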
Speaker: Rob Robason
Rob Robason is a Senior Process Engineer with Intrinsyx Technologies at NASA Ames Research Center, supporting Ames' SEPG, and the Software and Data Systems (S&DS) team of the Space Station Biological Research Project. S&DS achieved a CMMI Maturity Level-2 rating last October, and Rob was singled out and recognized for his contributions to the team's accomplishment, including his definition and implementation of the S&DS Measurement and Analysis process, critical to the CMMI rating. Rob has also improved software processes at Cisco Systems, Accugraph, and Hewlett Packard. He also has experience as a Systems Engineer and Software Development Engineer at HP, and as the Quality Manager at Accugraph. Rob earned a BS in Electronic Engineering at Cal Poly, and has done graduate work in Computer Science at Colorado State and UC Santa Cruz and in Industrial Engineering at Texas A&M. Rob can be reached at rob AT robason.net.

December 9, 2003

Topic: Capability Maturity Model for Software (CMM)

This presentation provided an overview of the SEI Capability Maturity Model (SEI/CMM). Topics included the purpose of the SEI/CMM, its five maturity levels, the importance of metrics, and how the SEI/CMM framework can be used to understand and improve the quality of software development.
Speaker: Jeff Jacobs
Jeff Jacobs has over 20 years of experience in software development, with a focus on software development methodologies and practices. His management experience ranges from leading development for startups to overseeing multi-company efforts for communications satellite systems. He has consulted for numerous companies and has trained over 3,000 students in Information Engineering and various modeling techniques. Jeff is the author of numerous papers and is a frequent presenter at technical conferences. He received his B.S. in Information and Computer Sciences from U.C. Irvine, where he was one of the co-authors of UCI LISP. Jeff is a consultant providing services in software process improvement, methodology adoption and tailoring, and business/systems analysis and modeling. His web site is http://www.jeffreyjacobs.com.
