DataContract Design


quaffapint

Dec 15, 2011, 8:11:48 AM
to servic...@googlegroups.com
I'm using ServiceStack for my REST based service, with PetaPoco as my ORM (works with my Stored Procs, and has T4s to generate the models).

Here's a simple, common scenario.  I have an object Book, whose model is defined by PetaPoco.  I want to be able to look up the book by Id, as well as PUT an update and POST a new entry.  Here's the DataContract...

[DataContract]
[RestService("/books/{book_id}", "GET,PUT,POST")]
public class book_entry {

    [DataMember]
    public string book_id { get; set; }

    [DataMember]
    public book Book { get; set; }
}

...really basic.

So, here's the question(s)...

Based on what I want to do, is this a good setup - an id plus the object?  It works fine in a GET scenario - it returns the requested object when I pass the Id.  But in the PUT scenario, I always get a 500 error on var responseStream = client.GetResponse().GetResponseStream() in ServiceClientBase.cs.  I can post that code, but I didn't know if this was even the 'best' way to set up my DataContract.  In your examples, you define the model in the DataContract, but since I'm using PetaPoco to define my model, mine isn't set up that way.

Thanks for any suggestions.

Jon Canning

Dec 15, 2011, 8:29:12 AM
I think you should keep your domain objects, i.e. Book, within your service and map them to a DTO. AutoMapper is your friend!
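
For illustration, here's a minimal sketch of that approach (Book/BookDto are hypothetical names here, and this uses the static Mapper API AutoMapper had at the time):

```csharp
// Hypothetical domain object (as PetaPoco would generate) and its DTO
public class Book    { public int Id { get; set; } public string Title { get; set; } }
public class BookDto { public int Id { get; set; } public string Title { get; set; } }

// One-time configuration, e.g. at app startup
Mapper.CreateMap<Book, BookDto>();

// In the service: map the domain object to a DTO before returning it
var dto = Mapper.Map<Book, BookDto>(book);
```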

Demis Bellot

Dec 15, 2011, 11:05:08 AM
Agree with Jon - if your DTO and DB tables don't fit (as is usually the case for anything non-trivial), just split them out and map between them.
Whilst AutoMapper is a full-featured, configurable mapper that does the trick, if your property names are the same you can also get by without the dependency by using the pre-built mappers in ServiceStack (which also support the case where the property types are alike but not exact, e.g. Enum <-> String):

PopulateObject<To, From>(To to, From from)
PopulateWithNonDefaultValues<To, From>(To to, From from)
PopulateFromPropertiesWithAttribute<To, From>(To to, From from, Type attributeType)
TranslateTo<T>(this object from)
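
As a rough usage sketch of those helpers (Book/BookDto are hypothetical names; the extension methods come from ServiceStack.Common):

```csharp
// Create a new DTO instance with all like-named properties copied across
var dto = book.TranslateTo<BookDto>();

// Or copy onto an existing instance, only using non-default source values
var existing = new BookDto();
existing.PopulateWithNonDefaultValues(book);
```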

Cheers,

Jon Canning

Dec 15, 2011, 12:34:45 PM
Didn't know you did mapping too - that could save a dependency.

Demis Bellot

Dec 15, 2011, 12:38:16 PM
Yep :)

I haven't needed to use AutoMapper for a long time as I prefer to do explicit assignments for anything that doesn't map conventionally.

D

quaffapint

Dec 15, 2011, 2:56:23 PM
I appreciate the reply, but I guess I'm feeling a little dense :-).  I understand the mapping concept, but I'm not sure how to implement PopulateObject (it would be just a straight mapping) with my DataContract.

Thanks again.

Demis Bellot

Dec 15, 2011, 8:09:03 PM
Hi,

Whenever there's a mismatch between your RDBMS Data Model classes and the preferred response you want your service to provide, you should define separate POCOs (i.e. DTOs) that define the data you want your services to return, i.e. the Response DTO.
This is fairly common in batchful responses, where your RDBMS's relational POCOs don't match the shape of your Response DTOs, which tend to be more hierarchical. You'd also want to do this if you only want to expose a subset of the data that's stored in your DB tables (i.e. hide the rest).

Here's a quick example of what I mean using the classes from the ServiceStack.OrmLite home page:
(Disclaimer: just wrote this in a text editor now so may not compile :)

//DataModel classes (i.e. DB Tables)
public class Customer {
    [AutoIncrement] 
    public int Id { get; set; }
    
    public string FirstName { get; set; }
    public string LastName { get; set; }

    [Index(Unique = true)] // Creates Unique Index
    public string Email { get; set; }
}

public class Order {
    [AutoIncrement]
    public int Id { get; set; }
    
    [References(typeof(Customer))] //Creates Foreign Key
    public int CustomerId { get; set; }
    
    public DateTime? OrderDate { get; set; }
    public decimal Freight { get; set; }
    public decimal Total { get; set; }
}

...

//DTOs - what your web service returns.
public class CustomerOrders
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
    public List<OrderSummary> Orders { get; set; }
}
public class OrderSummary
{
    public int OrderId { get; set; }
    public DateTime? OrderDate { get; set; }
    public decimal Freight { get; set; }
    public decimal Total { get; set; }
}

....

//I personally like to store Mapping logic separately in Ext methods
public static class TranslationExtensions
{
    public static CustomerOrders ToCustomerOrders(this Customer from, List<Order> orders)
    {
        var to = from.TranslateTo<CustomerOrders>();
        to.Orders = orders.ConvertAll(x => x.ToOrderSummary());
        return to;
    }

    public static OrderSummary ToOrderSummary(this Order from)
    {
        var to = from.TranslateTo<OrderSummary>();
        to.OrderId = from.Id; //manually populate non-matching properties
        return to;
    }
}

...

public override object OnGet(CustomerOrders request)
{
    var customer = dbCmd.QueryById<Customer>(1);
    var custOrders = dbCmd.Where<Order>("CustomerId = @Id", new { customer.Id });
    var dto = customer.ToCustomerOrders(custOrders); // <-- Mapping from DB to DTOs here
    return dto;
}


Here's another real-world example that uses both TranslateTo, to initially create the DTO from the User's session, and PopulateWith, to populate the rest of the DTO from the DB User table:

public override object OnGet(UserProfile request)
{
    var session = this.GetSession();

    var userProfile = session.TranslateTo<UserProfile>();
    userProfile.Id = int.Parse(session.UserAuthId);

    var user = DbFactory.Exec(dbCmd => dbCmd.QueryById<User>(userProfile.Id));
    userProfile.PopulateWith(user);

    return new UserProfileResponse {
        Result = userProfile
    };
}




quaffapint

Dec 15, 2011, 8:33:07 PM
Thanks, Demis.  From what I see, though, it looks like there's a lot of duplication of fields.  If I have a customer DB model, and the web service is to return that customer model, do I actually have to type out that customer model yet again in my DataContract definition?

I guess because I have no changes between the DB model and what I want the service to return, I feel it should be very straightforward, with no mapping needed.  I think that's where I'm getting lost.  Say I have a customer: I want to look him up by ID and do a PUT to update him by ID.  Same model for both DB and Contract.

It's like I just want to do this...

[DataContract]
[RestService("/customers", "GET,PUT,POST")]
public class customer_entry {

    [DataMember]
    public customer Customer { get; set; }
}

...pass in /{id} and have it mapped to the Customer's ID.  Even if I include an ID string in the Contract, the PUT still fails.

What I'm really looking for is whatever is simplest to maintain.  That's why similar field entries in two locations (DB and Contract models) concern me.  It makes changes harder if you're hunting down fields in more than one model file.

Thanks again for taking the time to explain.

Demis Bellot

Dec 15, 2011, 8:43:17 PM
Although you may be reluctant to create and maintain the overhead of an additional class, they are 2 very different concerns, and you will eventually hit diminishing returns trying to bend a single class to both persist to the DB and serve your clients.

You can get away with using a single class when your response is fairly flat (i.e. not batchful), but as soon as you start adding DB fields you don't want in your response (e.g. auditing fields), you're going to struggle to make it work.

Also, re-using the DataModel as your DTO has a tendency to bind your web service's API to your internal DB implementation, so it will start to limit the refactoring that's possible, since every structural change has the potential to break your existing clients.

But yeah, if it's fairly flat and the data's public, I too stick with a single Model and just create composite DTOs, e.g.
https://github.com/ServiceStack/ServiceStack.Examples/blob/master/src/ServiceStack.Northwind/ServiceStack.Northwind.ServiceInterface/CustomerDetailsService.cs#L25
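
For illustration, a composite DTO in that style simply re-uses the data-model classes as properties (names hypothetical, sketched after the Northwind example linked above):

```csharp
// The data-model classes are exposed as-is; the DTO only composes them
public class CustomerDetailsResponse
{
    public Customer Customer { get; set; }
    public List<Order> CustomerOrders { get; set; }
}
```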

But the moment I start running into friction I split them out, because the alternate solutions quickly become ugly unmaintainable hacks.

Cheers,

Dan B

Dec 16, 2011, 5:43:59 AM
I had no idea that was there, cheers.

Ethan Brown

Dec 19, 2011, 2:53:12 PM
I totally agree with Demis that keeping different types at the service-layer/DTO boundary and the DB level is the way to go.  This is what I do as well, as it promotes better separation of concerns and decreased coupling, and can therefore enable easier testing.

Demis - at what point do you begin worrying about an Anemic domain model, though?

I agree that something like REST Service / DTO -> Repository -> DB / POCO works for a fair number of cases, on a smaller scale, and is perfectly acceptable in situations where the system is completely read-only or is simple persistence from client to back-end.

I'm curious to get your take (as I know you're opinionated on these things!) as to when you start to build out some sort of object model in the middle.  Obviously the benefits of building out the intermediate layer are being able to separate object behavior/validation/business rules (for lack of a better term) from the service layer and the persistence machinery... being able to share the model amongst multiple applications... and of course being able to test this domain model in isolation.

The tradeoffs are extra maintenance burden and increased architectural complexity.

Basically this boils down to -- do you start considering some form of DDD at a point when the services are doing too much?


Demis Bellot

Dec 19, 2011, 4:19:44 PM
Hey Ethan,

In the area of DTO -> (domain/data model/repository/etc.) -> RDBMS, I "go with my gut" here :) and use DRY as my guiding light.
I have the following general guidelines I go by, but I don't really follow a rule-book here...

I have a fundamental principle of getting everything into clean C# (ideally code-first) models as early as possible.
This applies to all external data sources/artefacts, i.e. not just RDBMS access (which is mainly where the anemic model applies).  E.g. instead of doing XML parsing in my domain logic, or dealing with the intricacies of extracting information from LDAP, I prefer to deserialize directly into a DTO (if I'm able to); otherwise I'll put the logic for parsing XML or LDAP nodes behind another DDD service that returns clean POCOs, so the knowledge of how to parse a certain XML document or LDAP tree stays cohesive (is invisible to others) and doesn't leak sporadically into other areas of the system.

Inside the repository, though, I impose no artificial limitations/patterns and give myself complete freedom to implement the functionality that needs to be done with the least amount of effort.

I make a point of trying to keep as much business logic around the Data Model as possible, either with helper methods directly on the class, or with extension methods (kept elsewhere) if that helper method would force a project dependency (e.g. mapping from DTO <-> DataModel).
I draw the line by moving domain logic that requires external (DDD) dependencies (e.g. cache, email service, etc.) off the domain model, but still kept inside the Repository, so anything accessing the service is none-the-wiser.

I'm not too worried that I might accidentally become an Anemic DataModel victim, since I see a couple of problems with the argument against it - namely the claim that:

In essence the problem with anemic domain models is that they incur all of the costs of a domain model, without yielding any of the benefits. The primary cost is the awkwardness of mapping to a database, which typically results in a whole layer of O/R mapping. 

This is not a problem when using Micro ORMs, because there's no extra mapping layer - your code-first POCO is the "master", authoritative definition that maps 1:1 with the underlying DB table.
E.g. if you use something like OrmLite, a dbCmd.Save(model); or dbCmd.Insert(model); is all you need (i.e. no config/mapping req'd) to persist your changes to your RDBMS table.
It's also not a problem when you're using a NoSQL datastore, which just blobs your domain model / or stores it hierarchically, which is less of a mismatch with OOP than an RDBMS.
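
As a rough sketch of that point (assuming an open dbCmd obtained from OrmLite's connection factory, and the Customer table class from the earlier example):

```csharp
var customer = new Customer {
    FirstName = "Jane",
    LastName  = "Doe",
    Email     = "jane@example.com",
};

// The code-first POCO is the authoritative definition: the INSERT is
// generated directly from it - no mapping configuration required
dbCmd.Insert(customer);
```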
And with the advent of CQRS, the suggestion is to keep querying off the data model (which I was doing before it became fashionable :).

In summary, I'm not concerned by the Anemic DataModel, but I still try to center business logic around the Data/Domain Model, and I use DDD Services liberally to encapsulate concerns - i.e. I never worry about my DDD Services doing too much, and I think the separation happens naturally with DRY: if I see logic repeated in unrelated DDD Services, I'll either refactor the shared logic out into common Extension Methods/Utils or, if it requires external dependencies, into another shared DDD Service.

BTW I always start with the fewest DDD Services (i.e. initially 1 repository for all data access) and refactor them out into smaller, more manageable pieces when the code-base size suggests it - i.e. I don't prematurely abstract :)

Not sure how much of the above is just me rambling, but hopefully it makes sense to others :)

Cheers,