Using VuGen to create input data for load testing


Sam

Aug 9, 2011, 11:31:20 AM
to LR-Loa...@googlegroups.com
Hi all,  We are new users of LoadRunner and just getting started.  I know we should get training, but the company is not willing to pay for it, so I am learning from the help, internet groups such as this one, and whatever else I can find.
 
I've got most of what I need worked out for load testing; we are mostly doing ODBC to DB2 right now.  Pretty straightforward.
 
What I have not found a good solution for is not a load test itself, but using LoadRunner and VuGen ODBC scripts to create input files for load testing.  What we would like to do is run a query and save the result set in a .csv or .dat file.  We have to data-mine this manually now, fairly frequently, and it is very time consuming.
 
That created file would then be used as input during load tests (after suitably randomizing and otherwise massaging the data as needed).
 
We would only run each script once to extract the data and save it, but we have hundreds of files to extract.  We know HP and others have tools to do this, but we cannot get those.  So a LoadRunner test with hundreds of data-mining scripts could save us time.  I would set it up so they do not all run at the same time.
 
I have found lrd_save_col, but so far I have not found a reasonable way to make it work for thousands of output rows.  I got it to work for a few rows, but I need thousands.  I put C code in the script to write the values saved by lrd_save_col to a file, but since it has to be defined for each column and row, that is not realistic.
I also got some info from HP on an lrd_fetch option that causes it to print to files using print.ini and related files.  We got it to work, but it is VERY slow, and for hundreds of scripts it is time consuming to build, at least with what I know now.
 
So, after a too-long first post: does anyone have a better way to do this?  You can most likely ignore that it is DB2; it is standard ODBC, not DRDA.
 
Thanks in advance for your helpful suggestions and comments ! 
 

Dan Franko

Aug 9, 2011, 11:54:27 AM
to LR-Loa...@googlegroups.com
I'm not 100% clear on what data you're creating/mining, but one approach might be to write a program in some other language (Java, C#, VBScript, etc.) and make a system call from your VuGen script.  You just have to produce something executable, like an executable JAR or an .exe file.  Then call it in your init action and read the file(s) it creates.  I also think you might be able to do this with a custom DLL, but I personally don't have experience with that yet.

Ruslan Kholyavkin

Aug 9, 2011, 2:48:27 PM
to lr-loa...@googlegroups.com

Sam,

I am not sure I understand you correctly, but what you need is the ability to run a query and set the result as an input parameter for the load test (like, for example, "userid" or other parameters you need to use in a scenario).  If that is correct, please take a look at the LoadRunner User Guide section called "Import Data from Existing Databases".  You need to install the client for the database type you are using (Oracle, DB2, SQL Server) so you have a driver, then use ODBC to create a new Data Source Name (a User DSN or System DSN).  Make sure the connection settings are correct and that you can access the proper database (test it).  Then go to your script and create a new parameter (stored in a .dat file).  You can make this parameter global, not tied to one script, so multiple scripts can use the same parameter (depending on your scenario logic and needs).  Use the Data Wizard to create the query, connect to the database using the DSN you created, and test it.  If your connection and query are valid, you will get the data you need, which becomes part of the .dat file, and at the same time you will be able to manipulate this data using all the manipulation abilities provided for any built-in parameter in LoadRunner.
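For reference, the .dat file behind a table parameter is just delimited text with a header row of column names, so it can also be generated by any external tool. A made-up example (column names invented):

```
CustomerID,AccountNumber
10001,ACCT-0001
10002,ACCT-0002
```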

 

I hope this helps.

 

Ruslan


--
You received this message because you are subscribed to the Google Groups "LoadRunner" group.
To post to this group, send email to LR-Loa...@googlegroups.com
To unsubscribe from this group, send email to
LR-LoadRunne...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/LR-LoadRunner?hl=en

Sam

Aug 9, 2011, 3:53:21 PM
to LR-Loa...@googlegroups.com
Thanks Dan, we had thought of that, but we figured it would be much simpler to use VuGen and templates than to get programmers to write lots of special programs.  That is our fallback plan, though, and we can do it outside of LoadRunner.  It is just hard to get the REAL programmers to write and maintain this type of stuff here.  (I am a program killer: I can do a little bit of everything and am good at nothing.)  Not your issue, but it is something I have to live with.
 
Thank you very much for the reply!  We might end up having to do it your way.  I see the next reply has something interesting...
 

Sam

Aug 9, 2011, 4:17:10 PM
to LR-Loa...@googlegroups.com, lr-loa...@googlegroups.com

Thanks Ruslan, I entirely missed that section and don't know anything about it or the Data Wizard tool you mention (unless that is what you use in VuGen to do variable substitution).  It sounds interesting and I will go look for it.

 

What we are trying to do is extract data from our customer databases that can then be used (at a future time) as input to the stored procedures we use in our big load tests.
We would not run these data-extraction scripts during the load test, but as a step in getting ready for one.
 
This currently takes many weeks of manual effort.  Our data and stored procedures change quite a bit, so we are constantly having to refresh the data.
 
 

Sam

Aug 9, 2011, 5:40:47 PM
to LR-Loa...@googlegroups.com, lr-loa...@googlegroups.com
OK, I did know about that and it is better than what we do now.  On the fallback list for sure!
 
That is in general what we want to accomplish, but it appears that we would still have to refresh manually when needed.  Or is there a way to tell VuGen to refresh all those parameter files before a load test is needed?  Since I have to refresh hundreds of data files, I would like to build an automated method.
 
But great ideas!   So close....
 
Thanks!

Nishant

Aug 10, 2011, 1:03:06 AM
to LoadRunner
Hi Sam,

Use the database functions that come with the "Web Services" protocol.
Record scripts with multiple protocols (including Web Services) and then
use the database functions to retrieve data and save it to files.  Use
the saved file as a parameter file for the next step or transaction.

For more info on the database functions, see the LR help:
HP LoadRunner Online Function Reference > Utility Functions: C
Language (LR) > Database Functions

Please reply.

Thanks
-Nishant

Ruslan Kholyavkin

Aug 9, 2011, 6:50:35 PM
to lr-loa...@googlegroups.com

Sam ,

I think this option is useful for you even if you have to create the .dat file from SQL and copy it (the .dat file, to another location: manual work) and use it later in a test.  In fact, you can keep a library of these .dat files by version, in case you need to come back to old versions.  One more point: with LoadRunner's parameter functionality you can easily change the format of the data stored in your generated .dat file.  At the least it will save you time in test preparation.

 

Thanks,

Ruslan


 
 


James Pulley

Aug 10, 2011, 9:30:58 AM
to LoadRunner
The suggestion below, for a dynamic data pull from the server, is an
absolutely wrong approach for a performance test script.  Such
models work very well for functional testing, but in a performance test
model this results in:

* An explosive growth in the number of database connections to the server,
because in most web models the user does not have a 1:1 connection
with the web server; instead, a pooled set of connections is used to
push/pull data from the web or application server.
* Queries that are not present in the production model now being
used to extract data, exerting great additional load on the system.
Such queries are rarely optimized and tend to cause horrible
performance problems.
* Your load on the server exploding from all of the additional queries.

James Pulley, http://www.loadrunnerbythehour.com/PricingMatrix

James Pulley

Aug 10, 2011, 9:43:45 AM
to LoadRunner
Ahh, now we get to the core of the matter, environment initial
conditions.....

Have you considered setting a restore point in your database prior to
your test and then rolling back to that restoration point before your
next test? This would allow you to have exactly the same data in
your database, including the amount of data, for each test run, thus
ensuring a consistent initial condition for your test. This would
also allow you to re-use your data from one test run to the next,
eliminating the time consuming step of repopulating data files.

It is really going to come down to which is easier from a time
perspective: repopulating your data files or restoring the database.
If you don't restore the database to the same state, then you will need
to make a note of the constant shift in environment initial conditions,
which will have an effect (likely an unknown effect) on test integrity,
as this violates some 'testing 101' process issues.

If you know the queries that produce the data, there is likely a simpler
way than using VuGen.  Most database servers come with a full set of
command-line shell tools which allow you to execute a query and
export the results in CSV format for use in other tools.  So you
should be able to create a batch file which executes your queries and
fully populates all of your data files in different directories on a
very straightforward basis, all independent of VuGen.  One item you
should take note of here: watch the data volumes you are using
for your data files, as all of these files get loaded into memory at
the inception of a test.  I can recall one old customer who decided
to use a CSV version of the local phone book to allow their script to
pick random names, addresses and phone numbers... for 500 users... on
one box... with no ramp-up.  They did not get very far with this idea;
the load generator (which was also the controller in this case)
totally seized up, requiring a hard restart.  So, if you only need
100 sets of data, be sure to qualify your query results with 'top 50' or
whatever qualifier your SQL grammar supports.

My recommendation: try the database restore first, and reuse
your data.  You should have some strong testing-process
arguments to make regarding the integrity of your test relative to
environment initial conditions, which include the size and complexity
of the data in your database.  If you have to drop back to
repopulating your data for each test, then see if you can take
advantage of the command-line query interface for your database server
in a batch method, to automatically rename your old data files
and create new ones in the same format as what you will
need for your testing effort.

James Pulley, http://www.loadrunnerbythehour.com/PricingMatrix

Sam Rudolph

Aug 10, 2011, 10:13:54 AM
to lr-loa...@googlegroups.com, James Pulley
Thanks James!
For a particular series of tests we do reuse the data and use restore
points where we can.  I should maybe mention we have been doing load
testing for years with another, lesser tool.
After a few months there are so many changes to the production databases
(both DB structure and data) and stored procedures that we have to refresh
the test databases with new copies of production and recreate the input
data.
Your idea of using the database tools is good; we have a number of them,
and that is what we do now.  There are non-technical local issues...  Maybe
we are/were trying to create something that we should not create.
It sounded good in my head, since I have other internal customers asking
to save result sets for other needs, but those are very small result
sets that I can handle.


We normally run ramp-up tests that range from 30 minutes to 1 hour to
over 4 hours (not frequent), and have anywhere from a few hundred input
rows to around 50k.  We try not to repeat some queries, to avoid DB caching.
This loading of input into memory I will have to watch closely.  On the
bright side, the programmers are changing the application so that some
of the heavy-hitter stored procedures will go away.  I hope.

Ruslan Kholyavkin

Aug 10, 2011, 11:31:45 AM
to lr-loa...@googlegroups.com

Sam ,

Restore points and transactional/definition data sets relate to the process of repeatable testing of your test DB, as soon as the DB is restored from the latest production version.

But I think we are missing one more piece of information.

Based on the info you provided, you are dealing with multiple production DBs.  Is this the same system (some kind of multi-tenancy or SaaS system), or are these customer-specific production DBs?  If they are customer-specific DBs, do all the customers use the same version of the product, or are the versions different?

My point is: are we dealing with customer-specific input data, or with general input data which needs to be rebuilt or recreated for the test?

 

Thanks,

Ruslan

 

P.S.  Sam, sorry for trying to get more info; it is a very interesting topic and it may help with my process as well.

 


 
Sam

Aug 10, 2011, 6:52:52 PM
to LoadRunner
Hi Ruslan,

Not a problem... probably more info than you want but here is a brain
dump...

This is a home-built database and application used to run our company,
an electric and gas utility.
The OS is z/OS, IBM mainframe.
The DB is DB2, IBM's database.
There are other OSes and databases, but I am not worried about them right
now.

We use a third-party ODBC driver, Shadow Direct, to access DB2.  It
looks like any other Windows ODBC driver.  It does not use the DRDA protocol.

There are multiple Production copies of the DB2 databases for
different parts of the company. One of those is far larger and used
far more than the others. This is what we copy.

We have the ability to copy that largest and most used database to a
test LPAR (logical partition: a mainframe VM image) and reload it into
another instance of the database that is used only for load testing.
This load-testing copy never gets updated by customers.
It is set up exactly like the production database: same buffers, disk
space, everything except the name.

We then build load test scripts to run against that loadtesting copy
of the database.

We can move the database to different mainframes with ease; it just
takes a few minutes.  They all share the same disks, but each copy of
the database is on its own disks.  All the LPARs can get to all the
disks, but each database instance only uses its subset of disks.  So
we can develop our test on a small machine and move to and run on a bigger
machine for the larger tests, on weekends.  We do not touch the
production databases with our testing.  (Bonus info; it does not affect my
getting-data-files quest.)

While we are running tests over several weeks or months, the production
database is still being changed by programmers and DBAs and updated by
many people; customers are being added, deleted, updated, and lots of
other stuff is going on.  This production activity does not happen
on the load-testing database.  It only gets updated by load testing,
until the next refresh.

Checkpointing/restore points can be done in various ways.  The disks
have multiple behind-the-scenes mirror copies.  There is a way to take
one set of mirror disks and save it at a point in time, and we can
then restore to that point in time after testing.  We rarely do it;
we are trying to improve this process, as it takes about the same amount
of time as restoring from regular backups.  It should not.

There are also ways to make the DB2 database roll back to a point in
time.  Again, in our load-testing case, it is faster to just restore
the backups.  Sometimes we don't need to restore, as we try to avoid
"non-repeatable" update stored procedures when we have to do frequent
tests.  This helps a lot, but it is not always possible.

At some point, we finish a series of loadtests. And when the next
loadtesting is needed, we start it all over again. Copy the
production database, restore to the loadtesting database, run test
after test after test...

and repeat.

In our old load-testing tool we had over 500 scripts for DB2 stored
procedures.  Luckily for me, they don't all change at the same time, so
I don't have to start over from scratch for each cycle.  Most of them
will continue to work.  I will have to update a number of them, but it
is not hard to do a few.

We are really just a few weeks into LoadRunner and I am trying to get
things set up.  I only have a handful of scripts so far, but the push to
build hundreds is about to start.  Oh, my poor eyes.  I will be using a
script template, but I am also sure I am not doing everything the best
way that LoadRunner can do it.  But I will learn!

The hard part is that after building a new copy of the load-testing
database, it has different data in the tables, so a lot of my
input data is no longer valid.  We then have to run lots of utilities to
get new input data, format it, test it, clean it, etc.  There are
hundreds of tables.  We could use the batch-job scheduling system, but
that is WAY too painful to set up.

So, I was thinking I could use VuGen to create SQL query scripts to
data-mine the input for the load-testing scripts, and save those
results in files.  Then I could use LoadRunner to run the hundreds of
data-mining scripts (staggered), with only one execution each.  Not
load testing, just building the input files for the stored-procedure
scripts.
It is looking like maybe not a good idea right now.

Sam

Aug 11, 2011, 8:43:01 AM
to LoadRunner
And after waking up this morning... I got to thinking about the
suggestion to use "Import Data from Existing Databases" to build
the input.
I was not thinking it through, I guess.
I think James said it refreshed that data at run time, and I did not
want to add that to the load test... but I could have
non-load-testing scripts that just rebuild the data/parameter files: a
different spin on the same technique I was looking for.  I could then
use the extracted files in the real load test.

At least that is something to look into... I will have to read up on
it again and try it out.  Sometimes I am quite slow at catching on.


I might still need to write out a file for other things, but this
might do it.
Thanks guys!

Ruslan Kholyavkin

Aug 11, 2011, 2:11:37 PM
to lr-loa...@googlegroups.com

Sam,

There is one more solution to think of

JMeter (a free tool) will do most of this for you, but it saves the result file in XML format (see the example at the end of this message), so you will need to create a program to strip the XML wrapper from the returned values.

 

If it looks interesting, please take a look at the following:

 

http://jakarta.apache.org/jmeter/usermanual/build-db-test-plan.html

 

You can run one simple test and put JDBC requests one after another, or concurrently, and each of them will generate an XML file similar to the one below, which will be stored on disk as a unique file (it needs some small additional work).  The next time you run it, it will execute the query you defined and rewrite the result files.

 

(I am just giving you an overview; if you think it will be useful, let me know and I can get you more details.)

 

 

This is an example of the stored result file, including the XML:

<?xml version="1.0" encoding="UTF-8"?>
<testResults version="1.2">
<sample>
  <responseData class="java.lang.String">ServiceID Name AreaID SupplierID BillingTypeID IsSpecialOrder IsEntitlement SupplierPartNumber ManufacturerName ManufacturerPartNumber PublicationDate ExpirationDate DisplayOrder Description DescriptionURL ExpectedDuration ExpectedDurationUnits PricingSchema EstimatedCost HasPortal1 PortalText1 PortalText1URL HasPortal2 PortalText2 PortalText2URL HasPortal3 PortalText3 PortalText3URL RevisionNumber ExpenseCode HomeCategoryID OverrideRoles ProcessID BillingRate IsInactive DocumentID PriceDisplaySchemaID OfferPriceCurrencyID PriceDescription IsPriceOveridden OverideOfferPrice CanStartLater IsBundle CannotBeBundled TenantID CreatedOn CreatedByPersonID ModifiedOn ModifiedByPersonID GUID DateQualityID
1 Base Service1 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   10  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:33:49.733 0 2011-07-06 18:42:51.037 0 CFC5554D-A87B-4385-9515-78F46494A294 4
2 Base Service2 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   3  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:33:54.427 0 2011-07-06 19:03:01.97 0 81A7BFBC-CFE7-4C64-BADC-49B24C818253 4
3 Base Service3 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   3  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:33:57.123 0 2011-07-06 19:43:17.487 0 8F9D38D7-A69A-4CE3-8E26-1C8A5C107402 4
4 Base Service4 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   3  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:33:59.503 0 2011-07-06 19:47:40.99 0 AF080CDD-5937-4833-9D80-16CF56F9D58F 4
5 Base Service5 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   3  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:34:01.89 0 2011-07-06 19:50:39.053 0 8EC43DE5-0F72-47E5-B5B9-AFB5976183B2 4
6 Base Service6 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   3  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:34:04.263 0 2011-07-06 19:52:01.303 0 647280CB-5ACE-4D36-8C23-28BAE4BEA70D 4
7 Base Service7 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   0  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:34:06.657 0 2011-07-06 20:59:59.683 0 AF4AD49A-BD4F-4104-AD83-F940E96D4F40 4
8 Base Service8 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   0  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:34:08.993 0 2011-07-06 21:00:52.95 0 2005FA99-E290-491C-A644-15FB81A38520 4
9 Base Service9 1 0 1 false 0      0   0.0 hours 0 0.0000 false   false   false   0  0 1 0 0.0 0 0 0 0  false 0.0000 false false false 1 2007-09-04 18:34:11.283 0 2011-07-06 21:05:42.267 0 94361A5E-BE25-440E-BE70-E2625B759520 4
</responseData>
  <samplerData class="java.lang.String">[Select Statement] select * from DefService where ServiceID &lt; 10;
</samplerData>
</sample>
</testResults>
 
 
Thanks,
Ruslan


Ruslan Kholyavkin

Aug 11, 2011, 2:20:31 PM
to lr-loa...@googlegroups.com

Sam ,

Also, as soon as you test refreshing of DB results at run time, please let us know if it actually works.  I also have to deal with huge DB sizes and data sets and have not had a chance to figure out whether it actually refreshes or keeps the initial value.  It would be good to know.

Thanks in advance

Ruslan

Sam

Aug 12, 2011, 7:20:02 PM
to LoadRunner

Will do.  I got the parameter screens to save a dynamic SQL result set
using the table option; it works great.
It did not like a stored procedure call, but I may have done
something wrong there.
On my first dynamic SQL request I forgot to limit the result set; it
croaked at a little over 228k rows returned.  But that would be more
than enough for me!

I have not had a lot of time to mess with it yet, and I have not figured
out if it refreshes at run time or not,
or if there is a manual way, but it is a start!  Thanks for the
suggestion, and I will let you know what I figure out.




Sam

Aug 15, 2011, 8:43:19 PM
to LoadRunner
Got a little time to play around with it.
I created a script that does nothing but display the value from the
database parameter file.
I ran it in LoadRunner (not VuGen) with 4 users and 4 transactions
each.

The date on the file did not change from last week, so I don't think it
updates the parameter file at execution time.  I just remembered I
forgot to check the log on the mainframe side; I will do that
tomorrow.
I can tell for sure then if it reran the SQL, but I am pretty sure
it did not.  Darn, I was really hoping that would work.

I played around with the system() command a little too.  I can get it
to run a DOS batch file without any issue, so on to looking at
utilities to extract data.  That may take a while.
> ...
>
> read more »

Oliver Lloyd

Aug 16, 2011, 6:18:01 AM
to LR-Loa...@googlegroups.com
I don't really see why you want to use a performance test tool like LoadRunner (or JMeter, for that matter) for this task.  Where's the benefit?  LoadRunner is useful because it allows you to run multiple threads, all concurrently executing the same scripts.  But to refresh the data in a database there is no requirement for concurrency, so why use LR?  There are a lot of other ways which are probably simpler, better, faster and more reliable.

Really, this is a job for a DBA or someone on the application team.  You'd do better to clearly define your requirements, and why you think there is benefit in automating the process, on paper, and then present it to someone who can assign a resource to the task and get the work done.

Dan Franko

Aug 16, 2011, 9:46:24 AM
to LR-Loa...@googlegroups.com
I haven't implemented this myself yet, but popen might be a better alternative to system().  For your situation it might be appropriate to check and make sure that your program/SQL commands executed properly.

Sam

Aug 16, 2011, 6:53:25 PM
to LoadRunner
Thanks Dan!
I actually saw that yesterday and have tried to use it, but no luck so
far.  I found the syntax; I just have not got it right yet.
I have gotten a programmer to look into writing a batch job for me, and
he thinks he can do it in about a month or so.
Then I can do it either with a batch job or with LoadRunner.  I am leaning
towards the batch job now; the benefits of using LoadRunner disappear
when I have to have a programmer involved.

Sam

Aug 16, 2011, 6:59:41 PM
to LoadRunner
Thanks for your input.  We are not trying to refresh the database, but
to extract information from it.  I was also hoping to use this as a
learning experience, which it has been.

LR, QTP and ST are tools I control and manage, among many others I have
access to, and I was looking for something better than a batch job.
What seemed like a really simple idea has turned into more time
than I have for it.  There really should be an easy way, but... there
ain't.
If all you smart people cannot figure it out, I am sure I can't
either.

Oliver Lloyd

Aug 17, 2011, 7:06:02 AM
to LR-Loa...@googlegroups.com
There isn't an easy way. LR is never going to make this sort of job easier - it will do the opposite!

Seriously, the best thing you can do is to append to the process that you use to refresh the data - whatever or whomever you use to copy the data over from production should then also run some steps that give you the data extract you need. Really, this should be an automated process - this sort of thing is not hard to automate. Whatever tool or interface the refresh is done by can almost certainly be extended to give the extract.

Then, you make your side of things better by ensuring that the output of this refresh process goes to a location that your scripts dynamically read from - so that you remove any steps at your end. Automate the repetitive work and you will have more time to do the useful / fun work.

Sam

Aug 17, 2011, 10:32:55 AM
to LoadRunner
... easy? ... obviously you don't work here!
You are right, that is the way it should work, but I have been trying
for YEARS to get that done, and
it always comes back to everyone involved saying they don't want to do
it that way.