[0.9] Building a service layer using the ReactiveMongo Play plugin


Pedro De Almeida

May 10, 2013, 9:34:33 AM5/10/13
to reacti...@googlegroups.com
Hello ReactiveMongo team!

I have been playing with the ReactiveMongo Play plugin and successfully managed to integrate the driver into our existing application (Play 2.1.1). Nice work!

Nevertheless, I am wondering how to design and build a service (and maybe a DAO) layer on top of the ReactiveMongo plugin. Realizing that the MongoController trait requires a self-type of play.api.mvc.Controller, I looked into ReactiveMongoPlugin and created a very basic service (without error handling):

object UserService {

    def collection = ReactiveMongoPlugin.db.collection[JSONCollection]("users")

    implicit def ec: ExecutionContext = ExecutionContext.Implicits.global

    def create(user: UserModel): Future[BSONObjectID] = {
        val objectId = BSONObjectID.generate
        val data = user.copy(_id = Option(objectId), createdAt = Some(new DateTime()), updatedAt = Some(new DateTime()))
        collection.insert(data).map {
            _ => objectId
        }
    }
}


... used by controllers in this way:

object UsersController extends Controller {

    def create = Action {
        implicit request =>
            ResourceForms.user.bindFromRequest.fold(
                errors => Ok(views.html.pages.users.create(errors)),
                user => Async {
                    UserService.create(user).map( _ =>
                        Redirect(routes.UsersController.index())
                    )
                }
            )
    }
}
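Since the service above deliberately omits error handling, one low-cost extension is to map driver failures into a domain-level failed Future with recoverWith, which controllers can then recover from uniformly. A self-contained sketch using only scala.concurrent; the ServiceException type and the stubbed insert operation stand in for a real exception hierarchy and collection.insert, and are illustrative only:

```scala
import scala.concurrent.{ExecutionContext, Future}
import scala.util.control.NonFatal

implicit val ec: ExecutionContext = ExecutionContext.global

// Hypothetical domain-level failure type, for illustration only
case class ServiceException(message: String, cause: Throwable)
  extends RuntimeException(message, cause)

// Like UserService.create, but wraps any driver failure so controllers can
// recover on ServiceException uniformly. `insert` stands in for
// collection.insert(data), and the String id for BSONObjectID.generate.
def create(insert: => Future[Unit]): Future[String] = {
  val objectId = "51aed7819971ebfddd28c071" // placeholder for a generated id
  insert.map(_ => objectId).recoverWith {
    case NonFatal(e) =>
      Future.failed(ServiceException("Unable to create user", e))
  }
}
```

A controller could then translate ServiceException into an error page in a single recover block, instead of handling raw driver exceptions in every action.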


Is this solution a good approach? What are the pitfalls of using ReactiveMongoPlugin directly, instead of MongoController, to access the database layer?

Thanks a lot!

Pedro De Almeida



Stephane Godbillon

May 10, 2013, 5:30:23 PM5/10/13
to reacti...@googlegroups.com
Hi,

You can indeed use ReactiveMongoPlugin: the only purpose of MongoController is to add helpers for controllers, such as the GridFS body parser.




--
You received this message because you are subscribed to the Google Groups "ReactiveMongo - http://reactivemongo.org" group.
To unsubscribe from this group and stop receiving emails from it, send an email to reactivemong...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
 
 

Alex Jarvis

May 11, 2013, 4:38:47 AM5/11/13
to reacti...@googlegroups.com
Hi Pedro,

You may find the following slides useful: https://speakerdeck.com/alexanderjarvis/experiences-using-reactivemongo-to-build-a-json-api-with-play (unfortunately, the video has not been made available yet).

I have created my own similar DAO/Service abstraction so that it's cleaner to work with multiple collections etc. from controller code, as you have. I think it's a good approach when you want separation of concerns and the ability to call your Mongo code from something that is not a Play controller. Sometimes, however, it's nice to keep this layer as thin as possible, and so you may implement one of your controller functions this way.

Thanks,
Alex

Pedro De Almeida

May 11, 2013, 6:59:38 AM5/11/13
to reacti...@googlegroups.com
Hello Stéphane,

Yes, I have seen that MongoController provides some helpers for parsing and serving files. As a side note, in the docs (or README.md for now), it could be interesting to encourage the usage of ReactiveMongoPlugin.

Thanks for the plugin, already waiting for the 1.0 release! ;)

Pedro De Almeida

Pedro De Almeida

May 11, 2013, 7:47:15 AM5/11/13
to reacti...@googlegroups.com
Hello Alex,

Thanks for the slides! In fact, I saw a bit late that there was an older thread about DAOs:
https://groups.google.com/forum/#!topic/reactivemongo/EDtMTI4aJfM

You and Marius Soutier provide some examples of building the DAO layer. I will take some inspiration from these gists to build my own DAO/service layers and eventually share the code. The short-term goal is to extract a clean RESTful JSON API which, as you said, could be called from any external service.

Looking at your gist (https://gist.github.com/alexanderjarvis/5141288), I'm wondering how you deal with the ExecutionContext when you call your DAO from the controller. Do you always import an ExecutionContext in controllers? Is there a way to avoid this and force execution on the DAO context, e.g. using closures:

object UserDao extends DocumentDAO[UserModel] {

    val collectionName = "users"

    /* Removed code */
}

trait DocumentDAO[T <: Identifiable] {

    val collectionName: String


    implicit def ec: ExecutionContext = ExecutionContext.Implicits.global

    lazy val collection = ReactiveMongoPlugin.db.collection[JSONCollection](collectionName)

    def findAll[S](f: List[T] => S)(implicit reader: Reads[T]) /*: Future[S]*/ = {
        collection.find(Json.obj()).cursor[T].toList.map(documents => f(documents))
    }
}

object UsersController extends Controller {

    def index(page: Int = 1) = Action {
        implicit request =>
            Async {
                UserDao.findAll { users =>
                    Ok(pages.users.index(users))
                }
            }
    }
}


Tricky?

Pedro De Almeida

Alex Jarvis

May 11, 2013, 10:30:43 AM5/11/13
to reacti...@googlegroups.com
Hi Pedro,

Looking at your gist (https://gist.github.com/alexanderjarvis/5141288), I'm wondering how you deal with the ExecutionContext when you call your DAO from the controller. Do you always import an ExecutionContext in controllers? Is there a way to avoid this and force execution on the DAO context, e.g. using closures:

Sometimes I use a MongoController and other times I think I am importing:

import play.api.libs.concurrent.Execution.Implicits._

which defines the execution context. I think originally I had ExecutionContext.Implicits.global defined in the DAO trait, which worked well, but now I'm overriding it in each DAO trait and it doesn't seem to take effect.

I think it needs a redesign, as I originally implemented it when first learning Scala. Ideally, it would be nice to specify an EC for each model/DAO, with a default provided, although I'm beginning to wonder if I will ever need this, in which case setting just one ExecutionContext for all would suffice.
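For what it's worth, overriding an implicit def in a concrete DAO does normally take effect, as long as nothing else in scope at the call site shadows it. A self-contained sketch of the "per-DAO EC with a default" idea, using only scala.concurrent; the DAO names and the stubbed query are illustrative:

```scala
import java.util.concurrent.ForkJoinPool
import scala.concurrent.{ExecutionContext, Future}

// Base trait: a default ExecutionContext that concrete DAOs may override.
// Because `ec` is resolved inside the trait's own methods, callers never
// need to import an ExecutionContext themselves.
trait BaseDAO {
  implicit def ec: ExecutionContext = ExecutionContext.Implicits.global

  // `fetch` stands in for a real query; it runs on this DAO's context.
  def findAllNames(fetch: => List[String]): Future[List[String]] =
    Future(fetch)
}

// A DAO pinned to its own pool (ForkJoin threads are daemon threads,
// so they will not keep the JVM alive)
object UserDAO extends BaseDAO {
  private val pool = new ForkJoinPool(2)
  override implicit def ec: ExecutionContext = ExecutionContext.fromExecutor(pool)
}
```

The key point is that the implicit is consumed inside BaseDAO's methods, so the override in UserDAO is what gets picked up; an import of another implicit EC in the controller cannot interfere.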

Cheers,
Alex


jakob dobrzynski

May 20, 2013, 8:04:09 AM5/20/13
to reacti...@googlegroups.com
Great work on the DAO, Alex! I'm just curious: have you created a new DAO for the 0.9 version of ReactiveMongo?
If so, is it possible to see some examples?

Regards,

Jakob

Pedro De Almeida

May 31, 2013, 11:56:08 AM5/31/13
to reacti...@googlegroups.com
Hello again, Jakob,

You may find this multi-gist useful (thanks to Alex and Marius for the great inspiration):
https://gist.github.com/almeidap/5685801#file-documentdao-scala

Note that this is still a work in progress, as there are some aspects (e.g. var fields in traits and case classes) that I would like to improve. You can use the DAO in this way:

UserModel.scala

case class UserModel(
    override var _id: Option[BSONObjectID] = None,
    override var createdAt: Option[DateTime] = None,
    override var updatedAt: Option[DateTime] = None,
    email: String,
    password: Option[String] = None
) extends TemporalModel

object UserModel {
    import play.modules.reactivemongo.json.BSONFormats._

    // Generates Writes and Reads thanks to Json Macros
    implicit val format = Json.format[UserModel]
}

UserDao.scala

object UserDAO extends DocumentDAO[UserModel] {

    val collectionName = "users"

    def findByEmail(email: String): Future[Option[UserModel]] = findOne(Json.obj("email" -> email))

    override def ensureIndexes = {
        Future.sequence(
            List(
                ensureIndex(List("email" -> Ascending), unique = true)
            )
        )
    }
}

MyController.scala

def create() = Action {
    Async {
        val user = UserModel(email = "f...@bar.com", password = Some("123456"))
        UserDAO.insert(user).map {
            result => result.fold(
                exception => throw UnexpectedServiceException(s"Unable to create user: [data=$user]", exception),
                userId => {
                    Logger.info(s"User successfully created: [_id=$userId]")
                    Ok(user.toString)
                }
            )
        }
    }
}

def find(email: String) = Action {
    Async {
        UserDAO.findByEmail(email).map {
            case Some(user) => {
                Ok(user.toString)
            }
            case None => {
                NotFound
            }
        }
    }
}


Happy to discuss for improvements!

Pedro De Almeida

jakob dobrzynski

Jun 1, 2013, 11:40:26 AM6/1/13
to reacti...@googlegroups.com
Hello!

Great work and thank you for this! This should really be part of the documentation!

Kind regards,

Jakob

Pedro De Almeida

Jun 4, 2013, 10:25:16 AM6/4/13
to reacti...@googlegroups.com
DAO layer updated with a fixed Future recovery:
https://gist.github.com/almeidap/5685801

Enjoy! ;)

Pedro De Almeida

jakob dobrzynski

Jun 4, 2013, 11:28:07 AM6/4/13
to reacti...@googlegroups.com
Hello!
I'm getting an error when trying to update my model:

Model:

package models.new_booking

/**
 * User: jakobdobrzynski
 * Date: 6/3/13
 * Time: 9:18 AM
 */

import org.joda.time.{DateTime}
import play.api.libs.functional.syntax._
import play.api.libs.json._
import models.booking.User
import models.booking.BookingTime
import models.booking.BookingState
import reactivemongo.bson.BSONObjectID
import utils.mongo.TemporalModel


case class BookingModel(override var _id: Option[BSONObjectID] = None,
                        override var createdAt: Option[DateTime] = None,
                        override var updatedAt: Option[DateTime] = None,
                        recoId: Long,
                        user: User,
                        bookingTime: BookingTime,
                        numOfGuest: Int,
                        status: BookingState.BookingState) extends TemporalModel {

  def accepted(): BookingModel = {
    this.copy(status = BookingState.ACCEPTED)
  }

  def sent(): BookingModel = {
    this.copy(status = BookingState.SENT)
  }

  def denied(): BookingModel = {
    this.copy(status = BookingState.DENIED)
  }

  def denyWithNewTimeSuggestion(): BookingModel = {
    this.copy(status = BookingState.DENIED_NEW_TIME_SUGGESTED)
  }

  def timeout(): BookingModel = {
    this.copy(status = BookingState.TIMED_OUT)
  }

  def sendOnOpening(): BookingModel = {
    this.copy(status = BookingState.ON_HOLD)
  }
}

object BookingModel {

  /* Implicits */
  import play.modules.reactivemongo.json.BSONFormats._

  val pattern = "yyyy-MM-dd'T'HH:mm:ssz"
  implicit val dateFormat =
    Format[DateTime](Reads.jodaDateReads(pattern), Writes.jodaDateWrites(pattern))

  import utils.EnumUtils.enumReads
  implicit val bookingStateReads = enumReads(BookingState)


  import play.api.libs.json.Reads._
  implicit val bookingModelReads: Reads[BookingModel] = (
    (__ \ "_id").read[Option[BSONObjectID]] and
      (__ \ "createdAt").read[Option[DateTime]] and
      (__ \ "updatedAt").read[Option[DateTime]] and
      (__ \ "recoId").read[Long] and
      (__ \ "user").read[User] and
      (__ \ "bookingTime").read[BookingTime] and
      (__ \ "numOfGuest").read[Int] and
      (__ \ "status").read[BookingState.BookingState]
    )(BookingModel.apply _)

  import utils.EnumUtils.enumWrites

  import play.api.libs.json.Writes._
  implicit val bookingModelWrites: Writes[BookingModel] = (
    (__ \ "_id").write[Option[BSONObjectID]] and
      (__ \ "createdAt").write[Option[DateTime]] and
      (__ \ "updatedAt").write[Option[DateTime]] and
      (__ \ "recoId").write[Long] and
      (__ \ "user").write[User] and
      (__ \ "bookingTime").write[BookingTime] and
      (__ \ "numOfGuest").write[Int] and
      (__ \ "status").write[BookingState.BookingState]
    )(unlift(BookingModel.unapply))
}

method:

def crazystuff: Future[Option[AsyncBookingResponse]] = {
  val flatFuture: Future[Option[AsyncBookingResponse]] =
    createBooking(bookingRequest.createBookingModel()).flatMap { booking =>
      val subject = "HELLO!"
      mailer.sendEmail(subject, "ja...@hello.se", MailContentBuilder.createBody(booking.get, bookingInquiryUrl))
      updateBooking(booking.get.copy(status = BookingState.SENT))
    }
  flatFuture
}

  def retrieveVenue(recoId: Long): Future[Option[BookableVenue]] = {
    venueRepo.findByRecoId(recoId).map {
      case Some(venue) => Some(venue)
      case None => None
    }
  }

  def createBooking(booking: BookingModel): Future[Option[BookingModel]] = {
    bookingRepo.insert(booking).map {
      result => result.fold(
        exception => {
          None
        },
        bookingId => {
          Some(booking.copy(_id = Some(bookingId)))
        }
      )
    }
  }

  def updateBooking(booking: BookingModel) : Future[Option[AsyncBookingResponse]] = {
    bookingRepo.update(booking._id.get.stringify, booking).map {
      result => result.fold(
        exception => {
          None
        },
        document => {
          Some(AsyncBookingResponse.okSendWhenOpening(document))
        }
      )
    }    
  }

Thank you!

jakob dobrzynski

Jun 4, 2013, 11:38:18 AM6/4/13
to reacti...@googlegroups.com
Do you have any idea why I'm getting this? I'm not trying to overwrite the _id, only to update the document.
Kind regards,
Jakob

jakob dobrzynski

Jun 4, 2013, 12:43:40 PM6/4/13
to reacti...@googlegroups.com
Some more findings:

1. In TemporalModel:

var createdAt: Option[DateTime]
var updatedAt: Option[DateTime]

but in the DocumentDAO:

document.created = Some(DateTime.now)
document.updated = Some(DateTime.now)

2. Tried to do an update with the native stuff:

Doing update with:

val modifier = BSONDocument(
  "$set" -> BSONDocument(
    "updateDate" -> BSONDateTime(new DateTime().getMillis),
    "status" -> BSONString(booking.status.toString)))

c.update(
  BSONDocument("_id" -> BSONObjectID(booking._id.get.stringify)),
  modifier,
  upsert = false).map { lastError =>
    Logger.info("Mongo LastError:%s".format(lastError))
    Some(AsyncBookingResponse.okSendWhenOpening(booking))
  }

This works, so I guess there must be some error when doing:

collection.update(DBQueryBuilder.id(id), DBQueryBuilder.set(document))

I wonder what's causing the _id write. Is it the Writes spec in my BookingModel, or something else?

3. Added the following in BaseDAO:

case 10148 => {
  Left(OperationNotAllowedException("", nestedException = e))
}

to get:

[error] application - DB operation failed: [message=DatabaseException['Mod on _id not allowed' (code = 10148)]]

Regards

Jakob

Pedro De Almeida

Jun 4, 2013, 6:15:00 PM6/4/13
to reacti...@googlegroups.com
Hi Jakob!

Regarding your findings:
  1. Fixed; I forgot to update TemporalModel in the multi-gist. I believe I should create a Play plugin with a GitHub repository instead; I will try when I have more availability (and the new base FileDAO is ready...).
  2. Are you using upsert = true, as discussed in another thread? It seems that you are trying to perform an upsert operation with a model that already has a defined _id field (hint: Some(booking.copy(_id = Some(bookingId)))). AFAIK, this is not allowed by MongoDB (cf. https://github.com/mongodb/node-mongodb-native/issues/473). But this suggests I could add a save() method to DocumentDAO, delegating to JSONCollection#save(), for completeness.
  3. Thanks for the contribution!
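The save() idea from point 2 can be sketched independently of the driver: insert when the document has no _id, replace the stored document when it has one (which is what delegating to JSONCollection#save would give). A toy version with plain Maps standing in for JSON documents and injected insert/replace operations; all names here are illustrative:

```scala
// Mongo-style save: documents without an _id are inserted (and receive a
// fresh id from the insert operation), documents with an _id replace the
// stored version under that id.
def save(doc: Map[String, Any],
         insert: Map[String, Any] => String,
         replace: (String, Map[String, Any]) => String): String =
  doc.get("_id") match {
    case Some(id: String) => replace(id, doc)
    case _                => insert(doc)
  }
```

A DocumentDAO implementation would plug the real collection operations into the two function slots.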

...and keep it Reactive! ;)

Pedro De Almeida


jakob dobrzynski

Jun 5, 2013, 2:28:32 AM6/5/13
to reacti...@googlegroups.com
Hello Pedro!

1. Yes that is a great idea!
2. I'm actually trying to do a regular update =) 

2.1 In the DAO I'm sending in my document and doing:
collection.update(DBQueryBuilder.id(id), DBQueryBuilder.set(document), upsert=false)

DBQueryBuilder.set(document) will generate the following JSON:
{"$set":{"_id":{"$oid":"51aed7819971ebfddd28c071"},"createdAt":1370412929896,"updatedAt":1370412930608,"recoId":1234567,"user":{"firstName":"jakob","lastName":"dobrzynski","email":"jakob@hello.com","mobile":"0735322852"},"bookingTime":{"date":1335736800000,"time":"18:30"},"numOfGuest":11,"status":"SENT"}}

Since the field "_id":{"$oid":"51aed7819971ebfddd28c071"} is included in the $set, Mongo will treat this as an update of the id.
I don't know how I should handle this. Should I build the $set object by hand without the _id field, or
should DBQueryBuilder.set exclude the _id field, since, as you said, updating the _id field is not allowed?

2.2 Also, shouldn't
def update(id: String, document: T)(implicit writer: Writes[T]): Future[Either[ServiceException, T]] = {
and
def update(id: String, data: JsObject): Future[Either[ServiceException, JsObject]] = {
have an upsert: Boolean parameter?
Like:
def update(id: String, document: T, upsert: Boolean)(implicit writer: Writes[T]): Future[Either[ServiceException, T]] = {

3. =)
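Regarding 2.1, the second option is straightforward: have DBQueryBuilder.set drop _id before building the modifier, so MongoDB never sees a "Mod on _id" (error 10148). A sketch of the idea with a plain Map standing in for JsObject; with Play JSON the same subtraction would be Json.toJson(document).as[JsObject] - "_id":

```scala
// Build a $set modifier from the document minus its _id, so the update
// never tries to modify the immutable _id field.
def setWithoutId(document: Map[String, Any]): Map[String, Any] =
  Map("$set" -> (document - "_id"))

// Example document in the extended-JSON shape quoted above
val doc = Map(
  "_id"        -> Map("$oid" -> "51aed7819971ebfddd28c071"),
  "status"     -> "SENT",
  "numOfGuest" -> 11
)
val modifier = setWithoutId(doc)
```

The same helper could also take the upsert flag from point 2.2 as a parameter defaulted to false.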
 
Kind regards,

Jakob

Jun Yamog

Aug 21, 2013, 8:37:13 PM8/21/13
to reacti...@googlegroups.com
Hi,

We have been using Play, ReactiveMongo and GridFS for the past few months. We are also using Solr as our search engine. It's been great, and a great learning experience for all of us. We started with a simple coast-to-coast light model and a generic controller trait. So far it has taken us pretty far; we are now re-evaluating our decisions, hoping to get some insight into what other people have experienced, and to evolve our code.

Good parts:
- it's really quick to bang in a new data type, collection, etc.
- so far it's been more productive than a more traditional Hibernate/DAO/Spring stack, for the problem space of storing different types of data with metadata

Not so good parts, or things we don't know yet:
- Because it's model-less (or a light model), the source of truth for our model is the Mongo data itself; the Scala model is a partial representation of the data. I think most people have similar models: id, created and updated. In the posts above some call them Identifiable and Temporal, which is closer to how a Hibernate/DAO model works. I guess the issue we have is that we still need the Scala models so that the Writes for dates and ObjectIds can be applied; otherwise we would need to look up the attributes by name instead of by type, so that when we persist them in Mongo they become the special Mongo ObjectId and date types.

How are you managing coast-to-coast, model-less or light models if you have more than 5 models/collections? Or is adding the overhead of a full Scala model preferable? Our Scala models are effectively used as validators, so each is the truth for a particular API, but not the full document model.

- This may not be specific to ReactiveMongo, but in our case we effectively have 3 JSON models, each varying slightly from the others: a front-end JSON, where the ObjectId is moved to an "id" attribute and stringified, and dates are converted to epoch numbers; a Mongo JSON, which is the reverse of the front-end model; and a Solr JSON, which removes, flattens and renames the fields to index. So we have quite a few transformers in our code, sometimes a lot, especially the Solr ones. I will post separately on the Play mailing list asking for help reducing this code. How many JSON models do most people have here? I guess 2 at a minimum: a front-end one and a Mongo one. We are actually investigating Elasticsearch, as it is schema-less, and the flexi-field schema of Solr is still a friction point for us.

- Our Play app is very thinly layered, as opposed to having service/DAO layers (like more traditional Java apps). So far it has been great, simple and thin, but is this very thin layer producing the two issues above? We essentially have only 2 layers: an implementation controller and the generic one. Are people still using the service/DAO/model pattern out of familiarity, or have they gone coast-to-coast/model-less and then reverted back to more traditional stacked layers?

- Testing. We found that speed is an issue for testing, so we only run Mongo-backed tests on the generic controller code; for all the upper layers we just mock the Mongo results. How are people testing their apps? We are using the specs2 Mockito trait.
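On the three-JSON-models point, the front-end vs Mongo id reshaping can at least be centralized in a single pair of transformers. A sketch of the Mongo-to-front-end direction with plain Maps standing in for JsObject (with Play JSON one would express this as a Reads[JsObject] transformer); the extended-JSON shape follows the $oid documents shown earlier in the thread:

```scala
// Move {"_id": {"$oid": "..."}} to a stringified "id" attribute, as
// described for the front-end JSON model. Documents without an _id pass
// through unchanged.
def mongoToFrontend(doc: Map[String, Any]): Map[String, Any] =
  doc.get("_id") match {
    case Some(oid: Map[_, _]) =>
      (doc - "_id") + ("id" -> oid.asInstanceOf[Map[String, Any]]("$oid"))
    case _ => doc
  }
```

The reverse (front-end to Mongo) transformer is symmetric, so each of the varying models only costs one pair of small functions rather than ad hoc rewriting at every call site.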

Thanks very much, it would be great if we can get some insights and feedback.


Jun





Pascal Voitot Dev

Aug 22, 2013, 9:15:54 AM8/22/13
to reacti...@googlegroups.com
On Thu, Aug 22, 2013 at 2:37 AM, Jun Yamog <jky...@gmail.com> wrote:
[snip]

A few quick arguments:

Just to be precise: when I began speaking about coast-to-coast, I never said it should be considered a universal approach :D
It's just an idea saying that a single way of modelling/validating data (mainly ORM) isn't good, because it doesn't fit so well in all cases. Sometimes you don't need a model; sometimes you need one because, in business terms, a static model is better than pure JSON.
I just want people to think for a few seconds before diving into case classes or DAOs or ORMs and ask themselves: "do I need a static model, or can I just validate the JSON and transmit the data to my DB directly?" (Naturally, this is very practical with MongoDB.)

I've also studied alternative solutions in this article about Play Autosource:
http://mandubian.com/2013/06/11/play-autosource/

Mongo is the perfect target for this approach: you begin without clear modelling, just manipulating blobs of JSON. Then, progressively, you add validations. Eventually you may become aware that you need a real static model, so you create it and use it in your autosource. My idea is to provide a versatile, iterative approach to modelling business data...

So in this approach the Autosource plays the role of the DAO, but doesn't force you to create a static model immediately. You can create one just when you need it.

I tend to think that data stored in the DB should be trustable (in most cases), so if you just need to read those data, there is no need to go through a model; just throw the JSON onto the network.

Concerning transformers for complex operations, I don't believe JSON Reads fit very well. I created JsZipper, which is far better as soon as you need to perform more complex manipulations:
- http://mandubian.com/2013/05/01/reactive-json-crafting-jszipper-reactivemongo-webservice/
- http://mandubian.com/2013/07/04/json-interpolation-pattern-matching/

regards
Pascal

Jun Yamog

Aug 22, 2013, 7:05:54 PM8/22/13
to reacti...@googlegroups.com
Hi Pascal,

Thanks for the great reply. More replies inline below.


On Fri, Aug 23, 2013 at 1:15 AM, Pascal Voitot Dev <pascal.v...@gmail.com> wrote:
A few quick arguments:

No real arguments here. I just want to make it clear that we think coast-to-coast is great. It helped us get up to speed pretty fast; that is why I also put in "or we don't know yet", as we are venturing into pretty unfamiliar/different territory. Learning from one another would be great.
 

Just to precise, when I began speaking about coast to coast, I never said it should be considered as a universal approach :D
It's just an idea saying: one single way of modelling/validating data (mainly ORM) isn't good because it doesn't fit so well in all cases. Sometimes, you don't need a model, sometimes, you need it because in terms of business, a static model is better than pure Json.
I just want people to think a few seconds before diving into case classes or DAO or ORM and ask themselves: "do I need a static model or can I just validate Json and transmit those data to my DB directly?" (naturally this is very practical in MongoDB)

Yes, agreed; there are no silver bullets. We also felt at the start that coast-to-coast was great; right now we are just re-evaluating it, and we want to know how others have done it. I can say that for our project it gave us a good, flexible start. Also, for a personal project of mine, which only had 3 models/collections, it was perfect.
 

I've also studied alternatives solutions in this article with Play Autosource:
http://mandubian.com/2013/06/11/play-autosource/

Mongo is the perfect target for this approach: you begin without modelling clearly but just manipulating blobs of Json. Then progressively, you add validations. Potentially, you become aware that you need a real static model so you create it and use it in your autosource. My idea is to provide a versatile and iterative approach to modelling business data...

So in this approach, the Autosource plays the role of the DAO but doesn't enforce you to create a static model immediately. You can create just if you need it.

Thanks for Autosource. I think it's great, and better than what we are doing. I had a quick read of the article and a quick skim of the ReactiveMongoAutoSource.scala code; it's similar to what we have and what we want to evolve our code into. You are just way ahead of us, so I have decided to create a branch, yank our CRUD stuff and refactor our code. If the experiment is successful we will likely just use Autosource.

We are also using angular.js, and deadbolt for security.
 

I tend to think that data stored in the DB should be trustable (in most cases) so if you just need to read those data, no need to go through a model, let just throw Json to the network.

Yes, I think so. However, the ORM/DAO way of thinking, where the object is the source of truth and the data is just an implementation detail, is still ingrained in us. Maybe it's more unfamiliarity than coast-to-coast being a bad fit for our current problem set.

Autosource is pretty good, though: start model-less and introduce a type eventually. I will give it a try.
 

Concerning transformers for complex operations, I don't believe Json Reads fit very well: I created JsZipper which is far better as soon as you need to perform more complex manipulations:
- http://mandubian.com/2013/05/01/reactive-json-crafting-jszipper-reactivemongo-webservice/
- http://mandubian.com/2013/07/04/json-interpolation-pattern-matching/

Thanks for this, I will try it out. I will try Autosource first.


Jun 

Pascal Voitot Dev

Aug 23, 2013, 3:13:01 AM8/23/13
to reacti...@googlegroups.com
On Fri, Aug 23, 2013 at 1:05 AM, Jun Yamog <jky...@gmail.com> wrote:

No real arguments.  I just want to make it clear, that we think coast to coast is great.  It helped us get up to speed pretty fast, that is why I also put in "or we don't know yet".  As we are venturing pretty unfamiliar/different territory.  So learning from one another would be great.
 

Actually, I believe coast-to-coast is a good idea to have in mind at the same level as static models. But it's still limited by the JSON structure itself, which is not as typesafely heterogeneous as we would like... We still need to go further on this!
 

[snip]

Yes agreed, there is no silver bullets.  We also felt at the start coast to coast is great, right now we are just re-evaluating it.  We also want to know how others have done it.  I can say for our project it gave us a good flexible start.  Also a personal project of mine, which only had 3 models/collection it was perfect.
 

The problem with modelling with classes is that you need to create as many classes as views. It quickly leads to hundreds of classes which don't represent anything at all in terms of business! If you can keep one single real business class and do the rest with pure Json, it can be a good compromise!
 

I've also studied alternatives solutions in this article with Play Autosource:
http://mandubian.com/2013/06/11/play-autosource/

Mongo is the perfect target for this approach: you begin without modelling clearly but just manipulating blobs of Json. Then progressively, you add validations. Potentially, you become aware that you need a real static model so you create it and use it in your autosource. My idea is to provide a versatile and iterative approach to modelling business data...

So in this approach, the Autosource plays the role of the DAO but doesn't enforce you to create a static model immediately. You can create just if you need it.

Thanks for the autosource. I think it's great and better than what we are doing. I had a quick read of it and a quick skim of the ReactiveMongoAutoSource.scala code. It's similar to what we have and what we want to evolve our code into. You are just way ahead of us. So I have decided to create a branch, yank our CRUD stuff and refactor our code. If the experiment is successful we will likely just use autosource.


yes test it, it's really low-level and quite simple so it can be hacked, modified pretty easily!
 
We are also using angular.js, and deadbolt for security.
 

I tend to think that data stored in the DB should be trustable (in most cases), so if you just need to read those data, there is no need to go through a model: let's just throw Json onto the network.

Yes, I think so. However, I think the ORM/DAO-style thinking, where the object is the source of truth and data is just an implementation detail, is still ingrained in us. Maybe it's more unfamiliarity than coast-to-coast being a bad fit for our current problem set.


Data is the truth IMO, not classes :D
In 5 years, nobody will care about your classes and you don't know the language that will be used but your data will still be there!
I prefer to consider data-centric architecture.
But technically speaking, when you are in a compiled typesafe language and you need to perform some complex operations with your data, naturally your language forces you to go to static structure such as classes!
 
The autosource is pretty good though: start modelless and add a type eventually. I will give it a try.
 

Concerning transformers for complex operations, I don't believe Json Reads fit very well: I created JsZipper which is far better as soon as you need to perform more complex manipulations:
- http://mandubian.com/2013/05/01/reactive-json-crafting-jszipper-reactivemongo-webservice/
- http://mandubian.com/2013/07/04/json-interpolation-pattern-matching/

Thanks for this, will try it out.  Will try to autosource first.


yes sure!
don't fear the zipper, it's not complicated at all: it's one of those great concepts that are quite simple to catch and use ;)
 

Jun 

Jun Yamog

unread,
Aug 23, 2013, 5:59:13 AM8/23/13
to reacti...@googlegroups.com
Hi Pascal,

On Fri, Aug 23, 2013 at 7:13 PM, Pascal Voitot Dev <pascal.v...@gmail.com> wrote:
Yes agreed, there is no silver bullets.  We also felt at the start coast to coast is great, right now we are just re-evaluating it.  We also want to know how others have done it.  I can say for our project it gave us a good flexible start.  Also a personal project of mine, which only had 3 models/collection it was perfect.
 

The problems of modelling with classes is that you need to create as many classes as views. It quickly leads to hundreds of classes which doesn't represent anything at all in terms of business! If you can keep one single real business class and do the rest with pure Json, it can be a good compromise!

Yes, this is true, especially when the problem space is mostly data manipulation rather than real computing.
 

Thanks for the autosource.  I think its great and better than what we are doing.  I had a quick read on it, and quick skim of the ReactiveMongoAutoSource.scala code.  Its similar to what we have and what we want to evolve the code to.  You are just way ahead of us.  So I have decided to create a branch, yank our CRUD stuff... refactor our code.  If the experiment is successful we will likely just use autosource.


yes test it, it's really low-level and quite simple so it can be hacked, modified pretty easily!

Yes, I already started hacking on your sample Play app. I have already integrated it into my personal project, and started to create a branch on our app.

There are a few things I saw, mainly a slightly different convention when using Angular (and likely any other JS framework that uses data binding). In the sample app I noticed that a query for the whole collection is performed for each create, update and delete operation (2 REST calls). I have changed it a bit so that the full query is only performed when the whole collection is loaded; create, update and delete operate on single items (1 REST call). See diff below:


I also added crude error checking. I also overrode the Angular update method to use PUT, which is also what you have done in AutoSourceController. I think what was happening before is that calling $save POSTs to / with an id; that is nominally a create, but since it has an id it effectively becomes an update.

Because of this I also overrode, for now, the insert and update methods of ReactiveMongoAutoSourceController. For update I changed it to return 204 instead of a 200 with just the id: if it is a 200 with the id only, Angular data-binds that response, which effectively deletes all the attributes aside from the id. For insert, I now return the whole model plus the id instead of just the id, and then insert that whole object into the JS array. Returning the whole object is helpful, especially if the backend adds a few bits to the object such as createdOn, createdBy, etc.

Hope this is helpful; I will try to make a pull request if you are OK with this. I will also update the other sample app, though I may need to ask for help getting the other DB backends working.
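For illustration, the two conventions above could look like this in a Play controller. This is only a rough sketch, not the actual AutoSource code: UserService and its methods are hypothetical stand-ins, and the actions are synchronous for brevity.

```scala
import play.api.libs.json.JsValue
import play.api.mvc.{Action, Controller}

object ItemsController extends Controller {

  // Hypothetical synchronous service standing in for the real DAO layer.
  object UserService {
    def update(id: String, json: JsValue): Unit = ()
    def insert(json: JsValue): JsValue = json
  }

  def update(id: String) = Action(parse.json) { request =>
    UserService.update(id, request.body)
    // 204 No Content: the client's data binding keeps its local copy intact.
    NoContent
  }

  def insert = Action(parse.json) { request =>
    // Return the stored object (including server-added fields such as
    // createdOn) so the client can push it straight into its array.
    val stored: JsValue = UserService.insert(request.body)
    Created(stored)
  }
}
```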
 

Data is the truth IMO, not classes :D
In 5 years, nobody will care about your classes and you don't know the language that will be used but your data will still be there!
I prefer to consider data-centric architecture.
But technically speaking, when you are in a compiled typesafe language and you need to perform some complex operations with your data, naturally your language forces you to go to static structure such as classes!
 

Yes for a data centric app this is true. 

Pascal Voitot Dev

unread,
Aug 23, 2013, 6:05:20 AM8/23/13
to reacti...@googlegroups.com
On Fri, Aug 23, 2013 at 11:59 AM, Jun Yamog <jky...@gmail.com> wrote:

yes PR are welcome!
Angular sample was just for fun... I didn't know angular at all and just played with it to see how it behaved with autosource..
I wonder about the insert returning id + object... returning just the id should fit all DB models and is quite efficient in terms of resources... maybe it could be another endpoint, insertAndGet...
Don't hesitate to send PR ;)
 
 

Data is the truth IMO, not classes :D
In 5 years, nobody will care about your classes and you don't know the language that will be used but your data will still be there!
I prefer to consider data-centric architecture.
But technically speaking, when you are in a compiled typesafe language and you need to perform some complex operations with your data, naturally your language forces you to go to static structure such as classes!
 

Yes for a data centric app this is true. 


Jianfeng Tian

unread,
Aug 28, 2013, 7:07:45 AM8/28/13
to reacti...@googlegroups.com
Just wondering: would it be simpler to write

implicit val fmt = Json.format[User]

instead of writing a Reads and a Writes?
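For comparison, here is a sketch of what the macro saves you, assuming a simple hypothetical User case class; the manual version shown is roughly the equivalent the macro derives at compile time.

```scala
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class User(name: String, age: Int)

object User {
  // One line: the macro derives both Reads and Writes at compile time.
  implicit val fmt: Format[User] = Json.format[User]
}

// Roughly what the macro saves you from writing by hand:
val manualFormat: Format[User] = (
  (__ \ "name").format[String] and
  (__ \ "age").format[Int]
)(User.apply, unlift(User.unapply))
```

The macro only works when the JSON field names match the case class fields; as soon as you need renaming or extra validation, the hand-written combinator form is the way to go.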

Andrius A

unread,
Oct 14, 2013, 9:04:55 AM10/14/13
to reacti...@googlegroups.com
This DAO is amazing; it makes life much easier!
However, I am a newbie in the Scala + Play world and am finding it difficult to pre-generate the BSONObjectID before saving the model, because I don't want to wait for it to be returned by MongoDB.

I did the following:

val prefId = BSONObjectID.generate
val pref = Preference(_id=Some(prefId), name="installed", value="true")

but Play complains with this exception:

Unexpected exception

NumberFormatException: For input string: "So"


java.lang.NumberFormatException: For input string: "So"
     java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
     java.lang.Integer.parseInt(Integer.java:449)
     reactivemongo.bson.utils.Converters$.str2Hex(utils.scala:39)
     reactivemongo.bson.BSONObjectID.<init>(types.scala:306)
     common.dao.DocumentDAO$class.insert(DocumentDAO.scala:24)
     models.PreferenceDAO$.insert(Preference.scala:37)
     Global$.onStart(Global.scala:30)


Your help would be much appreciated!

Thanks.

Pedro De Almeida

unread,
Oct 14, 2013, 2:37:20 PM10/14/13
to reacti...@googlegroups.com
Hello Andrius,

Thanks for your feedback! ;)

Regarding your problem, can you show me your Preference model? Is your model extending my IdentifiableModel or did you model your Preference identifier as a String?

If you are beginning with Scala, Play & MongoDB, you should really consider reading up on asynchronous programming. In your situation, I would probably use Scala Futures (nesting asynchronous calls or using onComplete/onSuccess callbacks) to achieve your requirements. Have a look at this great article:
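For illustration, here is a minimal self-contained sketch of chaining asynchronous steps with Future combinators; createPref is a stand-in for a DAO insert, not the real PreferenceDAO API.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Stand-in for an asynchronous DAO insert: returns a Future of the new id.
def createPref(name: String): Future[String] = Future(s"id-for-$name")

// Chain the asynchronous steps with combinators instead of blocking:
val result: Future[String] = createPref("installed").map(id => s"created:$id")

// Block only at the very edge (tests, scripts); in a Play action you would
// return the Future itself inside an Async block.
println(Await.result(result, 2.seconds))
```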

Good luck!

-- Pedro De Almeida

Andrius A

unread,
Oct 14, 2013, 3:14:49 PM10/14/13
to reacti...@googlegroups.com
Hi Pedro, your DAO is a life saver :) It is worth launching it as a dedicated project; it would benefit a lot of users, especially the ones who are new to Play + Scala.

I am extending TemporalModel, but I am not sure whether that's correct. Here is my Preference model:

case class Preference (

  override var _id: Option[BSONObjectID] = None,
  override var created: Option[DateTime] = None,
  override var updated: Option[DateTime] = None,

  var name: String,
  var value: String,
  var valueType: Option[DateTime] = None,
  var category: Option[String] = Some("core"),
  var allowOverride: Option[Boolean] = Some(false)

) extends TemporalModel

object Preference {
  import play.modules.reactivemongo.json.BSONFormats._

  implicit val format = Json.format[Preference]
}

and I am saving it like this:

val prefId = BSONObjectID.generate

Logger.debug("New BSONObjectId: " + prefId)

val pref = Preference(_id=prefId, name="installed", value="true")

PreferenceDAO.insert(pref).map {
      result => result.fold(
        exception => throw UnexpectedServiceException(s"Unable to create pref: [data=$pref]", exception),
        preferenceId => {
          Logger.info(s"Preference successfully created: [_id=$preferenceId]")
        }
      )
}

Thanks for your help!

Pedro De Almeida

unread,
Oct 14, 2013, 3:38:30 PM10/14/13
to reacti...@googlegroups.com
Hi all!

I am glad to see that this topic is still getting attention and constructive inputs.

Regarding static models vs JSON Coast-to-Coast, there are some considerations that I would like to add to complete your arguments. When I model/design business data, I try to take into account readability, particularly in a multi-developer context; simple and comprehensive models as well as clear and intuitive APIs! I believe that Pascal's JSON Coast-to-Coast approach allows more flexibility during development but, IMO, brings less methodology to (local) developers, which can lead to code consistency/maintenance issues. Depending on the level of validation your data requires, JSON Coast-to-Coast can do the job.

Regarding model views, I try to use implicit Writes or Writeables depending on the context, in order to avoid duplicating my model classes. Here is how I customize the output when writing a Result:

/**
 * Implicit formats for JSON-based models.
 *
 * @author Pedro De Almeida (almeidap)
 */
trait JsonFormats {

  implicit val bsonObjectIDFormat = play.modules.reactivemongo.json.BSONFormats.BSONObjectIDFormat

  /**
   * Writeable for [[play.api.mvc.Result]] content: serializes any `A` with a
   * `Writes` to JSON, renaming `_id.$oid` to `id` on the way out.
   */
  implicit def writeable[A](implicit w: Writes[A], wjs: Writeable[JsValue]): Writeable[A] = {
    wjs.map { (a: A) =>
      Json.toJson(a) match {
        case json: JsObject => transform(json)
        case json: JsArray => Json.toJson(
          for {
            item <- json.value
          } yield transform(item)
        )
        case _ => throw UnexpectedException(Some("Unsupported JSON type for writing!"))
      }
    }
  }

  def transform(json: JsValue) = {
    json.transform(
      __.json.update(
        (__ \ 'id).json.copyFrom((__ \ '_id \ '$oid).json.pick)
      ) andThen (
        (__ \ '_id).json.prune
      )
    ).fold(
      error => json,
      success => success
    )
  }
}

Pedro De Almeida

unread,
Oct 14, 2013, 4:15:27 PM10/14/13
to reacti...@googlegroups.com
Hm, your code should work supposing that you made a typo here:

val pref = Preference(_id=prefId, name="installed", value="true")

...should be:

val pref = Preference(_id=Some(prefId), name="installed", value="true")

Can you post the relevant console output when the exception arises? Also note that PreferenceDAO#insert will generate another BSONObjectID, which means that your prefId variable will not reference the created document!

I just updated the DAO gist with minor changes and fixes, could you give it a try?

Thanks,

-- Pedro De Almeida

Andrius A

unread,
Oct 14, 2013, 5:21:24 PM10/14/13
to reactivemongo
Thanks Pedro, your updated version now works for me! However, as you said, a new BSONObjectID was created even though I provided my own.
So I had to change the insert() method of DocumentDAO to generate a BSONObjectID only if one hasn't already been provided, as follows:

if(document._id == None) document._id = Some(BSONObjectID.generate) //probably there is a nicer way to do this in scala!
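The nicer way the comment asks about is Option#orElse, which keeps an existing value and only evaluates the fallback when needed. A self-contained sketch, with a String id generator standing in for BSONObjectID.generate:

```scala
// Keep an existing id, generate one only when missing.
// `newId` is call-by-name, so generation only happens when actually needed.
def ensureId(existing: Option[String], newId: => String): Option[String] =
  existing orElse Some(newId)

// In the DAO this would read:
//   document._id = document._id orElse Some(BSONObjectID.generate)
```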

I agree, this approach might help in some situations, but it will also cause problems when insertion fails and one tries to consume the generated id thinking that the document exists in the DB.

Andrius A

unread,
Oct 14, 2013, 7:49:23 PM10/14/13
to reactivemongo
Actually, the compilation now fails, and it looks like toList is deprecated in Scala 2.10.2:

DocumentDAO.scala:34: missing arguments for method toList in trait Cursor;
[error] follow this method with `_' if you want to treat it as a partially applied function
[error] collection.find(query).cursor[T].toList

If I change it to collection.find(query).cursor[T].toList() then it shows a deprecation warning.
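For what it's worth, with the refactored Cursor API the non-deprecated call should be collect. This is only a fragment, assuming the 0.10-era signatures (collection, query and T come from the surrounding DAO):

```scala
// toList is deprecated in favour of collect, which also lets you choose the
// target collection type, e.g. collect[Vector]().
collection.find(query).cursor[T].collect[List]()
```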

Stephane Godbillon

unread,
Oct 15, 2013, 3:46:36 PM10/15/13
to reacti...@googlegroups.com
Hi everyone,

About the deprecation warning and the compilation error: this is due to my recent refactoring of Cursor. Sorry about that ;)
Actually it may break a little more in the next couple of weeks, as I'm actively working on making this API better. The last commits made error handling much more efficient and removed the annoying (and useless) "killCursors: found 0 of 1".
By the way, if you have questions or comments about the new stuff, don't hesitate to tell me!

Cheers,

Pedro De Almeida

unread,
Mar 20, 2014, 5:51:48 AM3/20/14
to reacti...@googlegroups.com
Hello guys,

FYI, I just updated the DAO design gists with some improvements and the introduction of a FileDAO: https://gist.github.com/almeidap/5685801

Any comments or suggestions are welcome!

@Stéphane: the DAO code runs smoothly with ReactiveMongo 0.10.0, thanks for the release!

--
Pedro

Pedro De Almeida

unread,
Mar 20, 2014, 6:12:34 AM3/20/14
to reacti...@googlegroups.com
Hi again,

Just posted a concrete implementation of some FileDAO on this multi-file gist:

The ImageDAO uses Scrimage (https://github.com/sksamuel/scrimage) for resizing images, using an iteratee to consume the original file enumerator.

Please let me know what you think about this way of manipulating files and whether it can be improved.

Thanks,
--
Pedro

Stephane Godbillon

unread,
Apr 16, 2014, 2:56:43 AM4/16/14
to reacti...@googlegroups.com
Thanks for sharing, nice example :) The only drawback is that the file is loaded into memory, but you don't really have any other choice here.

Pedro De Almeida

unread,
Apr 16, 2014, 9:24:35 AM4/16/14
to reacti...@googlegroups.com
Hello Stéphane,

Feel free to challenge my work even more and to guide me toward a better ReactiveMongo integration. I am currently in the process of refactoring this DAO design and, as you are developing the ReactiveMongo project, any input from you would really be appreciated, even if it goes against my implementation!

> The only drawback is that the file is loaded into memory, but you don't really have any other choice here.

In which method, FileDAO#findOne() or ImageDAO#duplicate()? How could it be improved?

By the way, just updated FileDAO gist to include the missing insert() method: https://gist.github.com/almeidap/5685801#file-filedao-scala-L29

Pedro

Alejandro Vidal Castillo

unread,
Jul 13, 2016, 6:56:21 PM7/13/16
to ReactiveMongo - http://reactivemongo.org
It's 2016 here, and I wonder how I can implement this great DAO design with the current traits of ReactiveMongo (ReactiveMongoPlugin is deprecated).

Cédric Chantepie

unread,
Jul 13, 2016, 9:32:37 PM7/13/16
to ReactiveMongo - http://reactivemongo.org
As indicated in the docs, the new trait for Play context is ReactiveMongoApi.
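A minimal sketch of the dependency-injected style, assuming Play 2.4+ with the Play2-ReactiveMongo module (controller and collection names are placeholders):

```scala
import javax.inject.Inject
import scala.concurrent.{ExecutionContext, Future}

import play.api.mvc.Controller
import play.modules.reactivemongo.{MongoController, ReactiveMongoApi, ReactiveMongoComponents}
import play.modules.reactivemongo.json.collection.JSONCollection

class UsersController @Inject() (val reactiveMongoApi: ReactiveMongoApi)
                                (implicit ec: ExecutionContext)
  extends Controller with MongoController with ReactiveMongoComponents {

  // Resolve the collection from the injected ReactiveMongoApi instead of
  // the deprecated ReactiveMongoPlugin.db singleton.
  def collection: Future[JSONCollection] =
    reactiveMongoApi.database.map(_.collection[JSONCollection]("users"))
}
```

A DAO can take the same ReactiveMongoApi as a constructor parameter, which keeps the design from this thread intact while dropping the plugin singleton.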