
Re: automated storage test framework


On 05/10/2010 10:53 PM, James Laska wrote:
On Mon, 2010-05-10 at 15:05 -0400, Chris Lumens wrote:

Over the past few weeks, I've been hard at work on creating an automated
storage test framework for anaconda.  We came up with this concept
sometime around FUDCon Toronto, but it's just finally come together.

This framework provides a way to automatically run a kickstart
partitioning snippet and validate that it does what you intended it to.
I currently have it running against the latest anaconda package in
rawhide, but there's no reason it couldn't instead be run against the
git repo.  We'd just have to do scratch builds beforehand.
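
For anyone who hasn't seen one: a "snippet" here is just the partitioning
commands from a kickstart file.  A minimal example (sizes and names made up
for illustration) might look like:

```
clearpart --all --initlabel
part /boot --fstype=ext4 --size=500
part pv.01 --size=1 --grow
volgroup vg_test pv.01
logvol / --vgname=vg_test --size=4096 --name=lv_root
logvol swap --vgname=vg_test --size=1024 --name=lv_swap
```

The framework would then boot an install with that snippet and afterwards
check that the volume group, logical volumes, etc. actually exist as requested.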

Running against F13 and earlier is impossible since it requires my
modular anaconda patches.

I also don't see any reason why this couldn't be even more automated -
instead of requiring you to kick off a run, we could easily script it to
happen every time there's a new anaconda build, provided we have the
spare hardware to do so.  If we decide to do that, I'll have to make
results reporting fancier, as it's just logging to a local directory for
now.

My current strategy is to go through the partitioning section of the
Fedora test matrix (http://fedoraproject.org/wiki/Test_Results:Current_Installation_Test)
and convert all those into test cases.  Once we've got that done, we can
start adding test cases to check very specific pieces - ignoredisk
behavior, what happens when you have two existing disks with conflicting
LVM metadata, conditions from a single bug, iSCSI, whatever.  It's
really pretty flexible.
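
Just to sketch what a test case pairing could look like - the helper and
names below are invented for illustration, not what the framework actually
uses - one piece of a validation step might parse the snippet to know what
to expect:

```python
# Hypothetical sketch: extract the target of each "part" command from a
# kickstart partitioning snippet, so a test can compare the request
# against what actually ended up on disk.

def kickstart_part_targets(snippet):
    """Return the target field (mount point, 'swap', 'pv.xx', ...)
    of every 'part' command in a kickstart snippet."""
    targets = []
    for line in snippet.splitlines():
        fields = line.split()
        if fields and fields[0] == "part":
            targets.append(fields[1])
    return targets

snippet = """\
clearpart --all --initlabel
part /boot --fstype=ext4 --size=500
part / --fstype=ext4 --size=4096
part swap --size=1024
"""
print(kickstart_part_targets(snippet))  # ['/boot', '/', 'swap']
```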

Currently, this code doesn't live anywhere besides a directory on my
computer.  It's not in a local anaconda git branch.  It's not in autoqa.
Is there a good place for this stuff to live, or is it destined to be
off on its own?

Committing this to AutoQA seems appropriate to me.  I can see this
living alongside other installation related tests in AutoQA.  Not sure
if you have commit privs, but we can certainly fix that.

Or maybe just make it part of the tests dir in anaconda git, see below.

What frequency do you anticipate having these tests run?  Every new
anaconda build?

I would really like to see these tests run as part of a build; there is
a reason a spec file can have a %check section: test cases failing is a
very valid reason to abort a build.

I'm not sure how feasible this is, though.  I guess that building the
livecd requires repo and thus network access, and it will take quite
a bit of resources too.  If others agree it is desirable to run this at
anaconda (package / rpm) build time, we could chat to the infrastructure
people about this.
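
For reference, %check is just a scriptlet rpmbuild runs between %build and
%install, and a non-zero exit aborts the build.  A hypothetical fragment
(the "make check" entry point is a placeholder for whatever the framework
ends up providing) would be:

```
%check
# Abort the build if any storage test case fails.
make check
```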

What does everyone think?  Are my test cases too picky?  Not picky
enough?  Is there something obviously stupid that I'm doing?  I'd like
to get some help plowing through the test matrix before I open this up
to the world at large to play with.

This is amazing stuff!  Well done :)

I have to second that, great work!

Are you aware of the scsi_debug module?  It allows you to create fake SCSI
disks with very precise parameters.  I guess you can probably make them
identical enough to make the multipath tools think they are a multipath.
This would allow tests for things like partition alignment (faking 4k
physical sector disks), etc.
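
For example, a fake 4k-physical-sector disk can be created roughly like this
(a sketch; check the module parameters against your kernel's scsi_debug
documentation before relying on them):

```shell
# physblk_exp is the power-of-two multiplier on the logical sector size,
# so sector_size=512 with physblk_exp=3 reports 512 * 2^3 = 4096 byte
# physical sectors.
echo "physical sector size: $((512 * (1 << 3))) bytes"

# Needs root and a kernel with scsi_debug; skipped otherwise.
if [ "$(id -u)" -eq 0 ] && modinfo scsi_debug >/dev/null 2>&1; then
    modprobe scsi_debug dev_size_mb=1024 sector_size=512 physblk_exp=3
    # The fake disk appears as a normal sd device; lsscsi (or
    # /sys/bus/pseudo/drivers/scsi_debug) shows which one.
fi
```

For the multipath case, I believe loading scsi_debug with vpd_use_hostno=0
and add_host=2 makes the same virtual disk show up with an identical WWID on
two hosts, which multipath should then coalesce - but I haven't verified that
myself.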
