Pedro De Almeida
--
You received this message because you are subscribed to the Google Groups "ReactiveMongo - http://reactivemongo.org" group.
To unsubscribe from this group and stop receiving emails from it, send an email to reactivemong...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
Looking at your gist (https://gist.github.com/alexanderjarvis/5141288), I'm wondering how you deal with the ExecutionContext when you call your DAO from the controller. Do you always import an ExecutionContext in controllers? Is there a way to avoid this and force execution on the DAO context, e.g. using closures:
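One way to get the behaviour asked about (a sketch of the closure idea, with assumed names, not taken from the gist) is to have the DAO capture its own ExecutionContext, so that all Future combinators inside the DAO run on it and callers never need one in scope:

```scala
import scala.concurrent.{ExecutionContext, Future}

// Sketch: the DAO owns its ExecutionContext; `ec` is captured at construction
// and used implicitly by map/flatMap inside the DAO's methods.
class UserDAO(implicit ec: ExecutionContext) {

  def findName(id: String): Future[Option[String]] =
    fetch(id).map(_.map(_.trim)) // this map runs on the DAO's context

  // stand-in for the actual ReactiveMongo call
  private def fetch(id: String): Future[Option[String]] =
    Future.successful(Some("  pedro "))
}
```

A controller can then call `dao.findName(id)` without importing any ExecutionContext, as long as it returns the Future as-is; the moment the controller wants to `map` over the result itself, it needs a context of its own again.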
var createdAt: Option[DateTime]
var updatedAt: Option[DateTime]

but in the DocumentDAO:

document.created = Some(DateTime.now)
document.updated = Some(DateTime.now)

2. Tried to do an update with the native stuff:
...and keep it Reactive! ;)
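For reference, the created/updated pattern in the snippet above is often factored into a small trait stamped by a generic DAO; a sketch with assumed names (Joda-Time `DateTime`, as in the snippet):

```scala
import org.joda.time.DateTime

// Assumed shape: models opt into timestamps via a mutable trait,
// and the generic DAO stamps them on insert/update.
trait Temporal {
  var createdAt: Option[DateTime] = None
  var updatedAt: Option[DateTime] = None
}

trait DocumentDAO[T <: Temporal] {
  def beforeInsert(document: T): T = {
    document.createdAt = Some(DateTime.now)
    document.updatedAt = Some(DateTime.now)
    document
  }

  def beforeUpdate(document: T): T = {
    document.updatedAt = Some(DateTime.now)
    document
  }
}
```

Incidentally, the snippet above declares `createdAt`/`updatedAt` but assigns to `document.created`/`document.updated`; if that is verbatim, the name mismatch alone would not compile.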
Pedro De Almeida
Hi,

We have been using Play, ReactiveMongo and GridFS for the past few months. We are also using Solr as our search engine. It's been great, and a great learning experience for all of us. We started with a simple coast-to-coast light model and a generic trait controller. So far it has taken us pretty far; we are now re-evaluating our decisions, hoping to get some insight into what other people have experienced, and to evolve our code.

Good parts:
- It's really quick to bang in a new data type, collection, etc.
- So far it has been more productive than a more traditional Hibernate/DAO/Spring stack, for our problem space of storing different types of data with metadata.

Not-so-good parts, or things we don't know yet:
- Because it is model-less (or light-model), the source of truth for our model is the MongoDB data itself; the Scala model is a partial representation of the data. I think most people have similar models: id, created and updated. In the posts above some call them Identifiable and Temporal, which is closer to how a Hibernate/DAO model works. The issue we have is that we still need the Scala models so that the writers for dates and ObjectIds can be applied. Otherwise we would have to look up attributes by name instead of by type, so that when we persist them in Mongo they become the special Mongo ObjectId and Date types. How are you managing coast-to-coast, model-less or light models once you have more than 5 models/collections? Or is the overhead of a full Scala model preferable? Our Scala models are effectively used as validators, so each one is a truth for a particular API, but not the full document model.
- This may not be specific to ReactiveMongo, but in our case we effectively have 3 JSON models, each varying slightly from the others: a frontend JSON, where the ObjectId is moved to an "id" attribute and stringified, and dates are converted to numbers (epoch values); a Mongo JSON, which is the reverse of the frontend model; and a Solr JSON, which removes, flattens and renames the fields to index. So we have a number of transformers in our code, sometimes a lot, especially the Solr ones. I will post separately on the Play mailing list asking for help reducing this code. How many JSON models do most people have here? I guess 2 at the minimum: a frontend one and a Mongo one. We are actually investigating Elasticsearch, since it is schema-less and the flexi-field schema of Solr is still a point of friction for us.
- Our Play app is very thinly layered, as opposed to the service/DAO layering of more traditional Java apps. So far it has been great, simple and thin, but is this very thin layer producing the two issues above? We effectively have only 2 layers: an implementation controller and the generic one. Are people still using the service/DAO/model pattern out of familiarity, or have they gone coast-to-coast/model-less and then reverted to more traditional stacked layers?
- Testing. We found that speed is an issue for testing, so we only run Mongo-backed tests against the generic controller code; for all the upper layers we just mock the Mongo results. How are people testing their apps? We are using the specs2 Mockito trait.

Thanks very much, it would be great if we could get some insights and feedback.
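The frontend/Mongo transformation described above is commonly expressed as Play JSON coast-to-coast transformers; a minimal sketch (the field names and document shape are assumptions, not the poster's actual code):

```scala
import play.api.libs.json._

// Move Mongo's {"_id": {"$oid": "..."}} to a plain string "id" for the frontend.
val toFrontend: Reads[JsObject] =
  __.json.update(
    (__ \ "id").json.copyFrom((__ \ "_id" \ "$oid").json.pick)
  ) andThen (__ \ "_id").json.prune

// The reverse direction, for writes going back to Mongo.
val toMongo: Reads[JsObject] =
  __.json.update(
    (__ \ "_id" \ "$oid").json.copyFrom((__ \ "id").json.pick)
  ) andThen (__ \ "id").json.prune

// usage: mongoDoc.transform(toFrontend) yields JsSuccess(frontendJson) or JsError
```

Dates stored as extended JSON (e.g. `{"$date": ...}`) can be handled the same way, with one more `copyFrom`/`prune` pair per field.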
A few quick arguments:
Just to be precise, when I began speaking about coast to coast, I never said it should be considered a universal approach :D
It's just the idea that a single way of modelling/validating data (mainly ORM) isn't good, because it doesn't fit well in all cases. Sometimes you don't need a model; sometimes you need one because, in business terms, a static model is better than pure Json.
I just want people to think a few seconds before diving into case classes or DAO or ORM and ask themselves: "do I need a static model or can I just validate Json and transmit those data to my DB directly?" (naturally this is very practical in MongoDB)
I've also studied alternative solutions in this article about Play Autosource:
http://mandubian.com/2013/06/11/play-autosource/
Mongo is the perfect target for this approach: you begin without modelling clearly but just manipulating blobs of Json. Then progressively, you add validations. Potentially, you become aware that you need a real static model so you create it and use it in your autosource. My idea is to provide a versatile and iterative approach to modelling business data...
So in this approach, the Autosource plays the role of the DAO but doesn't force you to create a static model immediately. You can create one only if you need it.
I tend to think that data stored in the DB should be trustable (in most cases), so if you just need to read those data, no need to go through a model: let's just throw Json to the network.
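As an illustration of reading without a model, here is a sketch of a Play action serving documents straight from Mongo (it assumes a `collection` of Play-ReactiveMongo's JSONCollection type; exact method names varied between ReactiveMongo versions of that era):

```scala
import play.api.libs.json._
import play.api.mvc._
import scala.concurrent.ExecutionContext.Implicits.global

// `collection` is assumed to be a play.modules.reactivemongo JSONCollection.
def listAll = Action.async {
  collection.find(Json.obj())        // empty selector: all documents
    .cursor[JsObject]                // read raw JsObject, no case class
    .collect[List]()                 // Future[List[JsObject]]
    .map(docs => Ok(JsArray(docs)))  // straight to the network
}
```

No Reads/Writes for a model class are involved anywhere; validation, if needed, can be applied as a `Reads[JsObject]` transformer on the way out.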
Concerning transformers for complex operations, I don't believe Json Reads fit very well: I created JsZipper which is far better as soon as you need to perform more complex manipulations:
- http://mandubian.com/2013/05/01/reactive-json-crafting-jszipper-reactivemongo-webservice/
- http://mandubian.com/2013/07/04/json-interpolation-pattern-matching/
Hi Pascal,

Thanks for the great reply, more replies inline below.

On Fri, Aug 23, 2013 at 1:15 AM, Pascal Voitot Dev <pascal.v...@gmail.com> wrote:
> A few quick arguments:

No real arguments. I just want to make it clear that we think coast to coast is great. It helped us get up to speed pretty fast, which is why I also put in "or we don't know yet". We are venturing into pretty unfamiliar/different territory, so learning from one another would be great.
> Just to be precise, when I began speaking about coast to coast, I never said it should be considered a universal approach :D
>
> It's just the idea that a single way of modelling/validating data (mainly ORM) isn't good, because it doesn't fit well in all cases. Sometimes you don't need a model; sometimes you need one because, in business terms, a static model is better than pure Json.
>
> I just want people to think a few seconds before diving into case classes or DAO or ORM and ask themselves: "do I need a static model or can I just validate Json and transmit those data to my DB directly?" (naturally this is very practical in MongoDB)
Yes agreed, there are no silver bullets. We also felt at the start that coast to coast was great; right now we are just re-evaluating it, and we want to know how others have done it. I can say that for our project it gave us a good flexible start. Also, for a personal project of mine which only had 3 models/collections, it was perfect.
> I've also studied alternative solutions in this article about Play Autosource:
>
> http://mandubian.com/2013/06/11/play-autosource/
>
> Mongo is the perfect target for this approach: you begin without modelling clearly but just manipulating blobs of Json. Then progressively, you add validations. Potentially, you become aware that you need a real static model, so you create it and use it in your autosource. My idea is to provide a versatile and iterative approach to modelling business data...
> So in this approach, the Autosource plays the role of the DAO but doesn't force you to create a static model immediately. You can create one only if you need it.

Thanks for the Autosource. I think it's great and better than what we are doing. I had a quick read of it and a quick skim of the ReactiveMongoAutoSource.scala code. It's similar to what we have and what we want to evolve our code towards; you are just way ahead of us. So I have decided to create a branch, yank our CRUD stuff... and refactor our code. If the experiment is successful we will likely just use Autosource.
We are also using angular.js, and Deadbolt for security.

> I tend to think that data stored in the DB should be trustable (in most cases), so if you just need to read those data, no need to go through a model: let's just throw Json to the network.
Yes, I think so. However, the ORM/DAO way of thinking, where the object is the source of truth and the data is just an implementation detail, is still ingrained in us. Maybe it's more unfamiliarity than coast to coast being a bad fit for our current problem set.
The Autosource is pretty good though: start model-less and move to a type eventually. I will give it a try.

> Concerning transformers for complex operations, I don't believe Json Reads fit very well: I created JsZipper which is far better as soon as you need to perform more complex manipulations:
> - http://mandubian.com/2013/05/01/reactive-json-crafting-jszipper-reactivemongo-webservice/
> - http://mandubian.com/2013/07/04/json-interpolation-pattern-matching/

Thanks for this, will try it out, though I will try Autosource first.
Jun
> Yes agreed, there are no silver bullets. We also felt at the start that coast to coast was great; right now we are just re-evaluating it, and we want to know how others have done it. I can say that for our project it gave us a good flexible start. Also, for a personal project of mine which only had 3 models/collections, it was perfect.
The problem with modelling with classes is that you need to create as many classes as views. It quickly leads to hundreds of classes which don't represent anything at all in terms of business! If you can keep one single real business class and do the rest with pure Json, it can be a good compromise!
> Thanks for the Autosource. I think it's great and better than what we are doing. I had a quick read of it and a quick skim of the ReactiveMongoAutoSource.scala code. It's similar to what we have and what we want to evolve our code towards; you are just way ahead of us. So I have decided to create a branch, yank our CRUD stuff... and refactor our code. If the experiment is successful we will likely just use Autosource.

Yes, test it. It's really low-level and quite simple, so it can be hacked and modified pretty easily!
Data is the truth IMO, not classes :D In 5 years, nobody will care about your classes and you don't know which language will be used, but your data will still be there!
I prefer to consider data-centric architecture. But technically speaking, when you are in a compiled typesafe language and you need to perform complex operations on your data, the language naturally forces you towards static structures such as classes!
Hi Pascal,

On Fri, Aug 23, 2013 at 7:13 PM, Pascal Voitot Dev <pascal.v...@gmail.com> wrote:
>> Yes agreed, there are no silver bullets. We also felt at the start that coast to coast was great; right now we are just re-evaluating it, and we want to know how others have done it. I can say that for our project it gave us a good flexible start. Also, for a personal project of mine which only had 3 models/collections, it was perfect.
>
> The problem with modelling with classes is that you need to create as many classes as views. It quickly leads to hundreds of classes which don't represent anything at all in terms of business! If you can keep one single real business class and do the rest with pure Json, it can be a good compromise!
Yes, this is true, especially since our problem space is mostly data manipulation rather than real computing.

>> Thanks for the Autosource. I think it's great and better than what we are doing. I had a quick read of it and a quick skim of the ReactiveMongoAutoSource.scala code. It's similar to what we have and what we want to evolve our code towards; you are just way ahead of us. So I have decided to create a branch, yank our CRUD stuff... and refactor our code. If the experiment is successful we will likely just use Autosource.
>
> Yes, test it. It's really low-level and quite simple, so it can be hacked and modified pretty easily!

Yes, I have already started hacking on your sample Play app. I have already integrated it into my personal project, and started a branch on our app.

There are a few things I saw, mainly a slightly different convention when using it with Angular (and likely any other JS framework that uses data binding). In the sample app I noticed that a query for the whole collection is performed after each create, update and delete operation (2 REST calls). I changed it a bit so that the query is only performed when the whole collection is loaded; create, update and delete are only performed on single items (1 REST call). See the diff below:

I also added crude error checking. I also overrode the Angular update method to use PUT, which is also what you have done in the AutoSourceController. I think what was happening before is that it was calling $save and POST-ing to / with an id, which is a create, but since it has an id it effectively becomes an update.

Because of this I also overrode, for now, the insert and update methods of ReactiveMongoAutoSourceController. For update, I changed it to return 204 instead of a 200 with just the id: if it is a 200 with only the id, Angular will data-bind that response, which effectively deletes all the attributes aside from the id. For insert, I now return the whole model + id instead of just the id, and then insert that whole object into the JS array.
Returning the whole object is helpful, especially if the backend adds a few bits to the object such as createdOn, createdBy, etc. Hope this is helpful; I will try to make a pull request if you are OK with this. I will also update the other sample app, though I may need to ask for help getting the other DB backends working.
> Data is the truth IMO, not classes :D In 5 years, nobody will care about your classes and you don't know which language will be used, but your data will still be there!
>
> I prefer to consider data-centric architecture. But technically speaking, when you are in a compiled typesafe language and you need to perform complex operations on your data, the language naturally forces you towards static structures such as classes!
Yes, for a data-centric app this is true.
--
implicit val fmt = Json.format[User]

Instead of writing a Reads and a Writes?
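For a case class, the `Json.format` macro does generate both a Reads and a Writes at once. A minimal sketch (the `User` fields here are an assumption for illustration):

```scala
import play.api.libs.json._

case class User(name: String, age: Int)

object User {
  // One line generates both Reads[User] and Writes[User]
  // from the case class's fields:
  implicit val fmt: Format[User] = Json.format[User]
}

// usage:
// Json.toJson(User("Ada", 36))                        // serialize
// Json.parse("""{"name":"Ada","age":36}""").as[User]  // deserialize
```

The macro only covers the straightforward field-by-field mapping; custom field names, ObjectId/date conversions, or validation constraints still need hand-written Reads/Writes or combinators.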
NumberFormatException: For input string: "So"