
Testing configuration changes in a service


Noel McGrath

unread,
Nov 9, 2016, 3:33:09 AM11/9/16
to Continuous Delivery

What is the best approach to testing a service when you add new configuration?

 

For example, my service offers a service to a customer, and based on the customer's configuration it will offer a different type of service. E.g. if the customer selects a particular currency, they are offered a 20% discount compared to another currency.
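A minimal sketch of the kind of config-driven domain logic being described, assuming a simple per-currency discount table (the names CURRENCY_DISCOUNTS and quote_price are illustrative, not from any real codebase):

```python
# Hypothetical per-customer configuration: discount rate keyed by currency.
CURRENCY_DISCOUNTS = {"EUR": 0.20, "USD": 0.0}

def quote_price(base_price: float, currency: str) -> float:
    """Apply whatever discount is configured for the chosen currency."""
    discount = CURRENCY_DISCOUNTS.get(currency, 0.0)
    return base_price * (1 - discount)
```

The domain logic stays fixed; only the configuration table changes when a customer's setup changes.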

The example above does not matter. What matters is the approach people take when doing CI/CD.

The logic for working out the discount is in the domain and has unit tests around it. My question is: if you have merchants configured with different rules to figure out the discount (all based on configuration, which the domain works out), and a request comes in to change the configuration, how do you verify it?

  1. Do you write more tests?
  2. Do you not test, since it is already covered by unit tests?
  3. Do you manually test the changes?
  4. Other?
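To make option 1 concrete, here is a hedged sketch that treats merchant rules as data, so verifying a configuration change means adding or editing a table row rather than writing new test logic (merchant names, rule shapes, and expected values are all invented for illustration):

```python
# Hypothetical merchant configurations, expressed as data.
MERCHANT_RULES = {
    "merchant-a": {"currency": "EUR", "discount": 0.20},
    "merchant-b": {"currency": "USD", "discount": 0.00},
}

def discounted_price(merchant: str, base_price: float) -> float:
    """Domain logic: apply the merchant's configured discount."""
    rule = MERCHANT_RULES[merchant]
    return base_price * (1 - rule["discount"])

# Expected outcomes live next to the configuration they verify; a
# configuration change is reviewed by updating the matching row here.
EXPECTED = {"merchant-a": 80.0, "merchant-b": 100.0}

def check_all(base_price: float = 100.0) -> None:
    for merchant, expected in EXPECTED.items():
        actual = discounted_price(merchant, base_price)
        assert abs(actual - expected) < 1e-9, (merchant, actual)
```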

I have read xUnit Test Patterns and Test-Driven Development, along with many articles, but have not come across how people manage this (configuration changes within a service, and verifying correctness).

I don't see this addressed in the Continuous Delivery book either.

Noel McGrath

unread,
Nov 9, 2016, 3:34:36 AM11/9/16
to Continuous Delivery
Would love to see best-practice links on how people deal with configuration changes from a testing and delivery point of view.

Josiah Carberry

unread,
Nov 9, 2016, 3:47:21 AM11/9/16
to continuou...@googlegroups.com

If you have confidence in your unit tests, then you should not need to do any more testing.

If you do not have confidence in your unit tests, then you need to rethink them. But in any case, you should not need to create tests specific to a configuration change (presuming that you are talking about changes like switching the base currency of the customer).

You wouldn't create a test each time a customer buys something, so why should changing their configuration parameters be any different?

--
You received this message because you are subscribed to the Google Groups "Continuous Delivery" group.
To unsubscribe from this group and stop receiving emails from it, send an email to continuousdeliv...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Noel McGrath

unread,
Nov 9, 2016, 4:34:30 AM11/9/16
to Continuous Delivery

Thanks, Robert.
Do you know of any guidance in this area? I don't see any mention in books or articles on how to handle configuration changes.

Kief Morris

unread,
Nov 9, 2016, 4:47:53 AM11/9/16
to continuou...@googlegroups.com
Ideally, your automated testing will ensure that the code works correctly, including when different configuration options are set.

For more complex configuration, you might like to test that the configuration itself achieves what you want. And you may want to test that changes to the configuration don't break existing, expected behaviour.

The best way to achieve this is to design the system so that configuration is in code. That is, it's externalized in files, which can be committed to version control, and automatically applied and tested in a test environment, before being applied to a production environment.
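As a rough illustration of configuration-as-code, a customer's rules might live in a versioned file and be validated by the pipeline before promotion (the file contents, field names, and validate function are assumptions, not from the thread):

```python
import json

# Hypothetical externalized configuration, as it might be committed to
# version control (e.g. a per-customer JSON file); inlined here as a string.
RAW = '{"customer": "acme", "currency": "EUR", "discount": 0.20}'

def validate(config: dict) -> None:
    """Checks a pipeline could run before promoting a configuration change."""
    required = {"customer", "currency", "discount"}
    missing = required - set(config)
    assert not missing, f"missing keys: {missing}"
    assert 0.0 <= config["discount"] <= 1.0, "discount must be a fraction"

config = json.loads(RAW)
validate(config)  # would run in the test stage, before production
```

Because the configuration is just a file, the same change can be applied and tested in a test environment, then promoted unchanged.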

What the pipelines for this look like will depend on your situation. For example, if you have a handful of customers, each with their own configurations, you might have a pipeline for their configuration files. Changes could be applied to a test environment running the existing production version of the application code.

This is hard to do with the way many applications handle configuration, i.e. stored in an internal DB and managed through a UI. That makes it difficult to separate the configuration from a running instance, so you can't easily test a single change in a test environment and then promote it into production.






--


My book, Infrastructure as Code, is available now!

Josiah Carberry

unread,
Nov 9, 2016, 5:02:37 AM11/9/16
to continuou...@googlegroups.com

I don't know of specific published guidance in this area. If it exists, I would look for the question of when changes to master data could be treated like a transaction and when they need to be treated more like a change in logic. But I suspect that such questions depend entirely on the structure of your solution, on the degree to which you have made implicit assumptions about the values of the configuration data. For example, if you build a system where you assume that base currency of a customer will not change, and then it changes, you are apt to need to review the processing logic and perhaps also your data structures. And that means adding tests. But if you have already built into your system the logic to handle such changes, then your normal testing (presumably automated) should be sufficient.


On 09.11.2016 10:34, Noel McGrath wrote:

Thanks, Robert.
Do you know of any guidance in this area? I don't see any mention in books or articles on how to handle configuration changes.

Noel McGrath

unread,
Nov 10, 2016, 5:22:17 AM11/10/16
to Continuous Delivery
Thanks, Kief and Robert, for the responses.

ghanajit .

unread,
Nov 14, 2016, 4:24:18 AM11/14/16
to continuou...@googlegroups.com
Why don't you apply combinatorial testing for configuration testing?
Yes, all kinds of configuration testing should be done at the code level, i.e. through automation. You can have a simple switch or data-driven tests, supported by most automation tools.
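A small sketch of what combinatorial configuration testing could look like: enumerate every combination of (hypothetical) configuration options and sanity-check each one, rather than hand-picking a few cases.

```python
import itertools

# Illustrative configuration dimensions; a real system would derive these
# from its actual configuration schema.
currencies = ["EUR", "USD", "GBP"]
tiers = ["standard", "premium"]
discounts = {("EUR", "premium"): 0.20}  # only this combination discounts

def quote(price: float, currency: str, tier: str) -> float:
    return price * (1 - discounts.get((currency, tier), 0.0))

# Exercise every combination of configuration options once.
for currency, tier in itertools.product(currencies, tiers):
    price = quote(100.0, currency, tier)
    assert 0.0 < price <= 100.0, (currency, tier, price)
```

For larger option spaces, pairwise (all-pairs) selection is a common way to keep the combination count manageable.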


Tim Coote

unread,
Nov 14, 2016, 6:22:33 AM11/14/16
to continuou...@googlegroups.com
I think that for large-scale IoT deployments, combinatorial testing is a must, as is capturing the actual configurations that have been encountered in the production environment. The approach also requires either in-production testing or software simulation for scale. I have seen scenarios where adding the 255th contact sensor broke the system as timeouts started being exceeded.

At the extreme, the quality of Thing simulations could be a significant driver of overall service quality and cost.

ghanajit .

unread,
Nov 14, 2016, 9:07:00 AM11/14/16
to continuou...@googlegroups.com
Thanks