These are the slides for CloudOpen Japan 2013 (May 31):
http://linuxconcloudopenjapan2013.sched.org/event/b0994396a7b878793f22cc4a0c5b27b7
You can also download them at http://events.linuxfoundation.jp/events/cloudopen-japan/program/presentations .
2. Agenda
1. Overview of OpenStack
2. Overview of Compatibility Issues
3. Overview of OpenStack Tests
1. Unit Test
2. Integration Test
4. Overview of Scenario Tests
5. Proposal for the Implementation at Havana Summit
6. Current Implementation of Scenario Tests
7. Current Status
1. Tempest
2. My proposal
8. Wrap-up
9. Appendix
Page 2
3. Overview of OpenStack
▌One of the most popular open-source IaaS infrastructure software projects
▌Consists of several loosely coupled components
▌Many features evolve on a six-month release cycle
Upgrade compatibility is one of the greatest concerns from the user's viewpoint!
4. Release Cycle - Overview of OpenStack
▌Many features are being developed on a six-month release cycle.
[Timeline: a new release every six months; next release is 2013.2]
5. Overview of Compatibility Issues
▌Databases
There are many changes in every release cycle.
• e.g., 35 DB schemas were changed and 33 new DB tables were added(*)
OpenStack components have database migration mechanisms.
Nova, Glance, and Cinder have a test framework for this issue.
• In Nova, this framework discovered a data-loss bug.
• Other components need volunteers.
* between Essex and Folsom
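The kind of migration test mentioned above can be sketched roughly as follows: apply an upgrade to a schema that already holds data, then verify no rows were lost. This is a minimal illustration using sqlite3, not the real Nova framework; the `instances` table, its columns, and the `upgrade` function are invented for the example.

```python
# Hypothetical sketch of a migration round-trip test; table and column
# names are invented, and sqlite3 stands in for a real OpenStack database.
import sqlite3

def upgrade(conn):
    # Simulated schema migration: add a column while preserving existing rows.
    conn.execute("ALTER TABLE instances ADD COLUMN availability_zone TEXT")

def test_upgrade_preserves_data():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE instances (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO instances (name) VALUES ('vm-1')")
    upgrade(conn)
    rows = conn.execute(
        "SELECT id, name, availability_zone FROM instances").fetchall()
    # The pre-existing row must survive; the new column defaults to NULL.
    assert rows == [(1, "vm-1", None)]
    return rows

rows = test_upgrade_preserves_data()
```

A real framework would walk every migration up (and down) against populated tables; the point of the sketch is only the "insert, migrate, verify" shape that catches data-loss bugs.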
6. Overview of Compatibility Issues
▌Configurations
OpenStack components have many configuration parameters.
According to the public documentation, there are over 600 in Nova alone!
• However, I think there are also many undocumented configuration parameters.
Over 130 configuration options changed between releases!(*)
* between Essex and Folsom
http://docs.openstack.org/trunk/openstack-compute/admin/content/list-of-compute-config-options.html
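Spotting the configuration churn described above can be automated by diffing the option names of two releases. A minimal sketch, with invented option lists (not the real Essex/Folsom sets):

```python
# Hypothetical sketch: diff configuration option names between two releases
# to spot compatibility-relevant changes. Option names here are examples only.
essex_opts = {"compute_driver", "sql_connection", "rabbit_host"}
folsom_opts = {"compute_driver", "sql_connection", "rpc_backend"}

added = sorted(folsom_opts - essex_opts)      # options a deployer must learn
removed = sorted(essex_opts - folsom_opts)    # settings that silently stop working
print("added:", added)
print("removed:", removed)
```

Removed options are the dangerous case for upgrades: an old configuration file keeps them, but the new release simply ignores them.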
7. Overview of Compatibility Issues
▌APIs
APIs are versioned in OpenStack.
APIs of the same version should be backward compatible.
However, most API parameters are not validated.
• This can cause compatibility issues through fluctuation in accepted input values.
Comprehensive and stricter input validation can avoid this issue.
There is some ongoing work in this area (for Nova and for Cinder).
These works are very nice! And we need tests to ensure the compatibility.
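Strict input validation means rejecting malformed request bodies up front, so clients cannot come to depend on accidentally accepted input. A hand-rolled sketch (the real Nova/Cinder work is schema-based; the field names and limits here are illustrative):

```python
# Hypothetical sketch of strict API input validation; the request field
# names and the 255-character limit are illustrative, not Nova's real rules.
def validate_server_create(body):
    """Reject a malformed server-create body instead of silently accepting it."""
    if not isinstance(body, dict):
        raise ValueError("body must be an object")
    name = body.get("name")
    if not isinstance(name, str) or not (1 <= len(name) <= 255):
        raise ValueError("'name' must be a string of 1-255 characters")
    return True

ok = validate_server_create({"name": "vm-1"})   # valid input passes
try:
    validate_server_create({"name": 123})       # wrong type is rejected
    rejected = False
except ValueError:
    rejected = True
```

Because the validator defines exactly what is accepted, the accepted input set stays stable across releases, which is precisely the compatibility property the slide is after.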
8. Overview of Compatibility Issues
▌How can we ensure compatibility? - Source code review?
Yes! We are already doing it on https://review.openstack.org.
All commits are reviewed by core reviewers.
9. Overview of Compatibility Issues
▌How can we ensure compatibility? - Testing?
Yes! We are already doing it.
Manually? No! Manual testing is painful!
• Jenkins does it: https://jenkins.openstack.org
• All commits are tested with Jenkins CI.
10. Overview of Compatibility Issues (Development Workflow)
▌Gerrit Workflow Quick Reference
[Diagram: contribution path through the OpenStack community environment - approve, merge; by sdague]
https://wiki.openstack.org/wiki/File:Contribution_path.png
11. Tests are very important!
Photo by David Bleasdale: http://www.flickr.com/photos/sidelong/246816211/
12. Overview of OpenStack Tests
▌Basically, test cases are always run automatically by the community.
-> Developers need to write test code for their feature implementations.
We have two types of tests: Unit Testing and Integration Testing.
Photos by David Goehring: http://www.flickr.com/photos/carbonnyc/6415460111/ and INTVGene: http://www.flickr.com/photos/intvgene/370973576/
13. Unit Test
▌All developers must write unit test code.
The test code makes the implemented code's behavior clear,
and ensures the quality of the feature code.
▌Tools
testr/nose
nose
• "is nicer testing for Python"
• extends unittest to make testing easier.
testr
• runs tests in parallel (so they go faster)
• keeps robust logs of the results.
Jenkins
• Continuous Integration environment.
• Jenkins runs the unit tests on every 'git review'.
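The tests that testr and nose discover are plain `unittest` test cases. A minimal sketch of one, where the function under test (`flavor_ram_mb`) is an invented example, not real Nova code:

```python
# A minimal unittest-style test case of the kind testr/nose discover and run.
# The helper under test is invented for illustration.
import unittest

def flavor_ram_mb(flavor):
    """Return a flavor's RAM in MB as an int (illustrative helper)."""
    return int(flavor["ram"])

class FlavorRamTest(unittest.TestCase):
    def test_ram_parsed_as_int(self):
        self.assertEqual(flavor_ram_mb({"ram": "2048"}), 2048)

    def test_bad_ram_raises(self):
        with self.assertRaises(ValueError):
            flavor_ram_mb({"ram": "lots"})

# Run the case directly; in the gate, testr would discover and run it instead.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(FlavorRamTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Writing the negative case (`test_bad_ram_raises`) alongside the positive one is what "makes the implemented code's behavior clear": the test documents what inputs are rejected.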
14. Integration Test
▌Tempest
Tempest is the OpenStack integration test suite.
It runs and validates automatically on every 'git review'(*).
(*) The OpenStack community uses the Gerrit system.
15. Integration Test - Tempest
▌When a test fails, you get '-1' from Jenkins.
▌Then you need to fix your code and run 'git review' again.
16. Status of Integration Testing
▌Test coverage of APIs (Nova)
[Chart: Test coverage of Nova APIs with Tempest (our research as of 2013-05-24) -
Implemented 36.3%, We've implemented 8.3%, Ongoing 13.3%, Unnecessary 25.4%,
Suspended 7.1%, Hard to implement 9.6%]
A little more! About 20% of the APIs remain untested.
17. Overview of Scenario Tests
▌What are Scenario Tests?
Testing across components such as Nova, Keystone, Glance, and so on.
Top-down testing from the user's perspective.
18. Overview of Scenario Tests (continued)
▌Tempest tests until Grizzly: individual component-based testing
Compute tests: create a keypair, create a security group, boot an instance, ...
Network tests: create keypairs, create security groups, create networks, ...
Block Storage tests: create a volume, get the volume, delete the volume, ...
Identity tests: ...
Image tests: ...
Object Storage tests: ...
19. Overview of Scenario Tests (continued)
▌What are Scenario Tests? Sequential testing across multiple components.
Scenario 1:
1. create a flavor
2. create an image
3. create a network
4. create & configure a project, a quota, a role, a user
5. create a keypair
6. boot an instance
7. list & show the instance
8. create a volume
9. list & show the volume
10. attach the volume
:
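A scenario test like the one above chains the output of one component's call into the next. A toy sketch of that shape, using invented fake clients (the class and method names are stand-ins, not the real Tempest or OpenStack client APIs):

```python
# Hedged sketch of a cross-component scenario test; the client classes and
# their method names are invented stand-ins for real OpenStack clients.
class FakeComputeClient:
    def create_keypair(self, name):
        return {"name": name}
    def boot_instance(self, name, keypair):
        return {"name": name, "key_name": keypair["name"], "status": "ACTIVE"}

class FakeVolumeClient:
    def create_volume(self, size_gb):
        return {"size": size_gb, "status": "available"}
    def attach(self, volume, instance):
        volume["attached_to"] = instance["name"]
        return volume

def scenario_boot_and_attach(compute, volumes):
    # Steps follow the slide: keypair -> instance -> volume -> attach,
    # validating state after each step.
    keypair = compute.create_keypair("scenario-key")
    instance = compute.boot_instance("scenario-vm", keypair)
    assert instance["status"] == "ACTIVE"
    volume = volumes.create_volume(1)
    volume = volumes.attach(volume, instance)
    assert volume["attached_to"] == instance["name"]
    return instance, volume

instance, volume = scenario_boot_and_attach(FakeComputeClient(), FakeVolumeClient())
```

The value over per-component tests is exactly this chaining: a regression in how Compute names an instance would surface as a failure in the Block Storage attach step.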
20. Overview of Scenario Tests (continued)
▌Effects of Scenario Tests
For developers: we can check, top-down, whether new code (bug fixes or
features) causes side effects in other components.
For users: we can increase coverage of the use cases that matter from the
user's perspective.
We can cover more of the users' use cases by adding various scenarios
according to the usage scene.
▌Points to Consider for a Scenario
Category of the scenario: private cloud, public cloud, VPC, and so on
Scale: # of tenants, users, networks, and so on
Validation method: REST API, ping, ssh, and so on
21. Proposal for the Implementation at Havana Summit
▌The Test Scenario of the First Implementation
Basic
Minimum
Across multiple components
▌Access Client and Directory Options
No. | Access Client            | Directory
1   | CLI (cli.ClientTestBase) | tempest/scenario
2   | Client Library           | tempest/tempest/tests/scenario
3   | RestClient               | tempest/tempest/tests/scenario
22. Current Implementation of Scenario Tests
▌The Test Scenario of the First Implementation
Basic
Minimum
Across multiple components
▌Access Client and Directory Options
No. | Access Client            | Directory
1   | CLI (cli.ClientTestBase) | tempest/scenario
2   | Client Library           | tempest/tempest/scenario
3   | RestClient               | tempest/tempest/tests/scenario
<- The concepts are unchanged.
23. Current Implementation of Scenario Tests - Detail
▌Basic - all OpenStack administrators can understand it
▌Small - the main code is only 208 lines
▌Besides, it tests across multiple components
https://review.openstack.org/#/c/26403/
25. Current Status of My Proposal
▌Blueprint
Not approved yet..
▌Code
A core reviewer reviewed it and gave +1. Almost merged!
26. Wrap-up
▌Tests are very important for ensuring compatibility
Especially, scenario tests are very important from the user's perspective.
▌The scenario test directory is ready now
▌Please share your scenario tests.
-> That will make all of us happy.