Entity Framework without Persistence Ignorance
Here is an interesting discussion about this topic. Apparently you will not be able to just take any object and persist it using the Entity Framework. You need to either inherit from a base class or (in the future) implement a set of interfaces. I can't figure out what the reason for that is. It is not as if supporting POCO persistence ignorance is hard or complicates the framework (at least not compared to the issues of persistence and querying themselves).
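To make the distinction concrete, here is a rough sketch of the two styles (the EntityBase base class is purely illustrative, not the actual Entity Framework API):

// Base-class approach (EntityBase is illustrative; the real framework type may differ).
namespace BaseClassStyle
{
    public abstract class EntityBase
    {
        // change tracking, key handling, etc. would live here
    }

    public class Customer : EntityBase
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }
}

// Persistence-ignorant approach: a plain class with no reference to any persistence framework.
namespace PocoStyle
{
    public class Customer
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }
}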
One of the goals for NHibernate is that you should be able to develop your application where only the controller has a reference to NHibernate, nothing else. This is quite important in a number of scenarios, from wanting to keep the model free of persistence concerns (and being able to change ORM providers) to needing to do more complex stuff with greater flexibility.
One case where I had really great success was generating mappings at runtime for a set of objects that were returned from a web service. The way it worked was that I got the WSDL, generated the code, generated the mappings, and started persisting, all at runtime. I had a naming convention that made sure I would be able to query everything properly, and it made everything so much simpler to work with.
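Roughly, the idea looks like this, assuming a convention where every public read/write property maps to a column of the same name (a simplified sketch; real mappings need more care with types and relations):

using System;
using System.Reflection;
using System.Text;
using NHibernate.Cfg;

public static class ConventionMapper
{
    // Builds a minimal hbm.xml mapping for a type by convention: "Id" is the key,
    // every other read/write property becomes a column with the same name,
    // and the result is fed to NHibernate at runtime via Configuration.AddXml.
    public static void MapByConvention(Configuration cfg, Type type)
    {
        StringBuilder xml = new StringBuilder();
        xml.AppendFormat("<hibernate-mapping xmlns='urn:nhibernate-mapping-2.2' assembly='{0}' namespace='{1}'>",
            type.Assembly.GetName().Name, type.Namespace);
        xml.AppendFormat("<class name='{0}' table='{0}'>", type.Name);
        xml.Append("<id name='Id'><generator class='native'/></id>");
        foreach (PropertyInfo prop in type.GetProperties())
        {
            if (prop.Name == "Id" || !prop.CanRead || !prop.CanWrite)
                continue;
            xml.AppendFormat("<property name='{0}'/>", prop.Name);
        }
        xml.Append("</class></hibernate-mapping>");
        cfg.AddXml(xml.ToString());
    }
}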
As an aside, I am looking for documentation for the Entity Framework, preferably some overview + architecture. Any good sources (I know about the LINQ examples for the Entity Framework)?
Comments
Ayende says, " Here is an interesting discussion about this topic. Apperantly you will not be able
Sam Gentile mentioned that he and the CodeBetter guys are ripping into the Entity Block guys at the MVP Summit this week. They are going to introduce the team to NHibernate. =)
One of the things that bugs me whenever Microsoft makes an attempt at something like this is that the first response to critique involves mention of time frame restrictions. It's frustrating in this particular regard because they have been working on these ideas, in one form or another, since at least early 2004 under the name Object Spaces, and possibly earlier (I did a cursory Google search). I understand that technology changes as time passes, but it feels kind of insulting to be told that they are releasing something "[that] is less than ideal, but it's the reality that we live in."
If I understand it correctly, the new Entity Framework is more 'Automated Persistence' than 'Transparent Persistence' (NHibernate).
NHibernate allows you to have domain objects that are not tied to the persistence mechanism, but the controllers have to be linked to NHibernate to allow the use of ISession, IQuery, etc. Isn't that annoying? Why can't I query my persistence layer in an abstract way? (Of course, LINQ is coming... but there is no provider for NHibernate yet.)
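For reference, the coupling in practice looks roughly like this: the domain class stays plain, and only the controller touches NHibernate (a sketch; Customer and the HQL string are just examples):

using NHibernate;

// The domain class knows nothing about NHibernate.
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// Only the controller references NHibernate types such as ISession and IQuery.
public class CustomerController
{
    private readonly ISession session;

    public CustomerController(ISession session)
    {
        this.session = session;
    }

    public Customer FindByName(string name)
    {
        IQuery query = session.CreateQuery("from Customer c where c.Name = :name");
        query.SetString("name", name);
        return (Customer)query.UniqueResult();
    }
}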
On the other hand, the Entity Framework forces you to inherit from a base class because there is an abstract persistence layer, allowing most of the functionality (querying, ...) to be developed at an abstract level while the actual storage is done through providers. Too bad that it seems overly complicated (multiple mapping files, new concepts, keywords...).
Why not try to bridge the two approaches into one? That is to say, have an abstract persistence layer that allows you to work with entities in an abstract way, that doesn't require writing several mapping schemas (everything is inferred from classes, properties, attributes... much like Castle ActiveRecord is doing), and that provides value-added functionality thanks to the abstract level (true in-memory bi-directional relations, notifications on persistence events, ...), with NHibernate being one of the persistence providers.
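One way to picture such a bridge (purely a sketch; the IPersistenceLayer interface and the NHibernate-backed implementation below are hypothetical, not part of any existing framework):

using System.Collections.Generic;
using NHibernate;

// A hypothetical provider-neutral persistence abstraction.
public interface IPersistenceLayer
{
    T Get<T>(object id);
    void Save<T>(T entity);
    IList<T> Find<T>(string queryText);
}

// One possible provider: an NHibernate-backed implementation.
public class NHibernatePersistenceLayer : IPersistenceLayer
{
    private readonly ISession session;

    public NHibernatePersistenceLayer(ISession session)
    {
        this.session = session;
    }

    public T Get<T>(object id)
    {
        return session.Get<T>(id);
    }

    public void Save<T>(T entity)
    {
        session.SaveOrUpdate(entity);
    }

    public IList<T> Find<T>(string queryText)
    {
        return session.CreateQuery(queryText).List<T>();
    }
}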
Somewhere in the application, you have to know how you are querying the database.
Trying to abstract that is a bad idea, since it goes down the same path as trying to abstract the location of a service, and we know how that turned out.
I don't see the EF providing abstract querying support at all.
Something like what you propose isn't going to work. There is a big disparity between the various persistence approaches. Considering that Microsoft isn't really known for working with the community to provide a provider model that can satisfy even 80% of the common ground, I just don't see it happening.
Take a look at the XyzDataSource troubles that exist now.
MMhhhh... I don't want to know that I am querying a 'database'. I want to retrieve data and save data. I don't care where the data comes from, where it is stored, nor how it is stored. That should be up to the configuration and the providers' implementation. I want the persistence layer to manage local or distributed transactions 'transparently', whether the data comes from one or more data sources.
This approach is already implemented in open & closed source frameworks (check Evaluant EUSS at http://euss.evaluant.com/).
Of course, this means that you have to settle on a common ground for the different providers (and they may have to report their capabilities).
-> If you compare persistence to game development, isn't that what DirectX is all about? :)
I went back to NHibernate after trying the Entity Framework (currently no 1-1 support) and DLINQ.
When I asked the DLINQ team on the forum to create separate class files (SqlMetal generates one large .cs file with all your classes in it), the response was basically 'why, it's generated'.
This to me is no different than a typed dataset!
So, they aren't looking at these as Entities like we talk about Entities, these are just generated drones - and it looks very generated. The solution was 'use partial classes'.
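For context, the 'partial classes' suggestion means splitting your additions into a second file that the generator never touches; a minimal sketch (the Customer class and its members are just examples):

// Customer.generated.cs - produced by the code generator, overwritten on every run.
public partial class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Customer.cs - the hand-written half of the same class; it survives regeneration.
public partial class Customer
{
    public bool IsPreferred()
    {
        // illustrative business logic living alongside the generated members
        return Id > 0 && !string.IsNullOrEmpty(Name);
    }
}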
I know, maybe I'm being picky, but I prefer NHibernate because I can build 'normal' classes, not inherited from anything, etc... PI as you say.
Now, I do think NHibernate deserves a knock for one thing: why isn't there a generator? I know it might contradict my statements above, but I think the reason people will end up using Microsoft's tool is that the time and effort to handcraft every entity and every mapping file is just a tremendous amount of work if you have even a semi-large normalized database with complex relationships.
I used Codus to get me started; it was a good start.
The documentation for NHibernate that comes with the latest release still has sections on it that talk about java. I had to double check to make sure I downloaded the correct binaries.
Why does this matter (documentation, a generator, etc.)? To me, it shows that a project is polished. I know looks aren't everything, but Microsoft will win people over with looks, and a project as large as NHibernate needs to step back and look at this.
I'm off subject now, but I think frameworks like NHibernate, Castle AR, etc. shouldn't treat code generation as an afterthought - and Microsoft seems to fall into the opposite extreme - it's just a code-generated mess - as always, they give you twice as much code as you probably need... LOL
One more quick comment:
The LINQ part of all of this is fantastic. And I hear that people are working on integrating/using LINQ with NHibernate. That is a good step.
There are quite a few generators for NHibernate. Gentle.Net, CodeSmith, MyGeneration, etc.
There isn't a single one, yes, and I don't think there will ever be one.
About the documentation, please open JIRA issues about this, we will fix them.
You have to understand that NHibernate has a single full time developer (Sergey) and a bunch of volunteers, so some stuff takes time.
There is going to be a (great) book about NHibernate soon, which will hopefully fill much of the gap.
Ok, well, I hope to get this book.
On generators: sorry, I haven't found one yet that even produces correct mapping files.
I mean a GOOD tool. I'm not going to be apologetic about this part; I've downloaded at least 10 tools, etc., and none are easy to use.
I understand it's open source, but just keep in mind that many OSS developers wonder why people go straight to Microsoft, and the primary reason, I think, is that MS typically makes things 'easy to use'. Whether right or wrong, or time, or resources, whatever, I'm just calling it like I see it...
Most of the generators are OSS as well; what were the problems that you ran into, and why not fix them and improve the quality in the process?
Just to make it clear, I understand what you mean, and I don't deny that there are things that OSS developers could do to make it better.
Persistence ignorance is as useless a term as POCO. A POCO class isn't persistable unless some post-compile or runtime magic is added to the class.
This is often overlooked by some of the vocal 'POCO or bust' people, and it gets really annoying, because it really matters WHAT is added to the POCO class: O/R mapper A adds different magic than O/R mapper B.
(Example:
myOrder.Customer = myCustomer;
in NHibernate, this doesn't mean that this is true:
myCustomer.Orders.Contains(myOrder)
however, in some other O/R mappers it is.)
So swapping O/R mappers will change the behavior of your application, no matter how hard you try to ignore the fact that this is the reality. And I haven't even mentioned the need for a Query object or some other query-specification method to formulate what you want to consume outside a repository.
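To make the difference concrete, here is a rough sketch of the fixup code a POCO model needs if you want both sides of the association kept in sync yourself, regardless of which mapper sits underneath (Order and Customer are just examples):

using System.Collections.Generic;

public class Customer
{
    private readonly IList<Order> orders = new List<Order>();

    public virtual IList<Order> Orders
    {
        get { return orders; }
    }
}

public class Order
{
    private Customer customer;

    // Setting the Customer also updates that customer's Orders collection,
    // so both sides of the association stay consistent in memory.
    public virtual Customer Customer
    {
        get { return customer; }
        set
        {
            if (customer != null)
                customer.Orders.Remove(this);
            customer = value;
            if (customer != null && !customer.Orders.Contains(this))
                customer.Orders.Add(this);
        }
    }
}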
So does this 'persistence ignorance' really exist? No. Not only is the database part of your application, whether you want it or not, it also makes up a huge part of your application's execution time, so your application spends a lot of time inside the DB. Ignoring that doesn't make it go away. In fact, ignoring it makes you vulnerable to badly performing applications, which you could have avoided with simple thinking and proper architectural decisions.
Another silly thing about this whole POCO myth is that a 'Customer' class isn't about the class. It's about the type it represents and what that type means, semantically.
Now, if you look at NIAM/ORM, and you look at a POCO entity model, they're at the same level. You can use NIAM/ORM to create entity representations in the DB (which isn't a 1:1 mapping, this for the NIAM-impaired; see http://www.orm.net), though you could also use it to produce entity classes for a POCO model.
So using an abstract model to produce your classes, what's really wrong with that? After all, it's not about technical details like code, it's about what these elements all mean. It really doesn't matter if you write 20% of the class and inherit the other 80% from a base class provided to you by a 3rd party or you write 100% of the functionality yourself.
That's not to say I don't like POCO, if someone wants to use POCO, go ahead. Just be honest about what happens and what's really important. I really tend to get fed up by some people who seem to think they know how it works and have to claim that if something doesn't use POCO it is crap.
After all, the core issue is that what POCO people really want is a place to write their own code without having to comply with a set of rules forced upon them by the 3rd party library they use the POCO classes with. If that can be solved, one way or the other, you've solved the problem.
If a POCO class ends up with 60% plumbing code for databinding, XML, helper code for producing strongly typed filters, prefetch paths, etc., and 40% BL code, what's so great about having to write that 60% of the code by hand? Isn't that tying your code to some persistence framework as well?
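As an illustration of the kind of plumbing meant here, this is the sort of hand-written databinding boilerplate a 'plain' class often ends up carrying (a sketch using INotifyPropertyChanged; the Customer class is just an example):

using System.ComponentModel;

public class Customer : INotifyPropertyChanged
{
    private string name;

    public event PropertyChangedEventHandler PropertyChanged;

    // Every property repeats the same change-notification plumbing by hand.
    public string Name
    {
        get { return name; }
        set
        {
            if (name == value)
                return;
            name = value;
            OnPropertyChanged("Name");
        }
    }

    protected virtual void OnPropertyChanged(string propertyName)
    {
        PropertyChangedEventHandler handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}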
I mean: dynamic-proxy-based POCO frameworks also force you to write your code in a given way. It's not 100% freedom anyway, and proper OO says that if you can inherit code from a supertype, why not use it? Isn't that what OO is all about?
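For example (a sketch; the exact requirements vary by mapper and version), NHibernate's lazy-loading proxies need the class members to be virtual, which is itself a rule imposed on your 'plain' code:

// To let the mapper subclass this at runtime with a lazy-loading proxy,
// the members have to be declared virtual - a constraint the framework
// imposes even though the class references no framework types.
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }

    public virtual decimal CalculateDiscount()
    {
        return 0m; // illustrative business logic
    }
}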
I couldn't agree with Frans more, even if I'd written his words myself.
See my comments about it:
http://www.ayende.com/Blog/archive/2007/03/20/Plain-old-.Net-classes.aspx