You have only the CLR 2.0, now go and build a complex application
Consider this scenario: you need to build a complex application, and political issues say that you have to start from scratch. This means no access to any external libraries, whether OSS or MS-provided.
Can you build a complex application that way?
The answer for me would probably be no. I know how I structure my applications today, and I build them using best-practice approaches for maintainability and testability. There is a lot of tool support out there that I use in order to build my applications; not being able to use it would significantly cripple my ability to build applications.
I couldn't even write those frameworks from scratch, at least not in a reasonable time frame. A basic OR/M layer would take about a week to write, but it would be very basic, and wouldn't support any of the more powerful features that actually make OR/M attractive (lazy loading and eager loading, for instance). A basic IoC container takes two days to write, but it wouldn't allow interception, it wouldn't allow easy extensibility, and its support for complex nested dependencies would be severely limited.
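To make the "two days" claim concrete, here is a sketch of what such a minimal IoC container might look like. This is illustrative code, not any real library; all class names are made up, and note the naive recursive resolution and the complete absence of interception, lifetime management, and cycle detection:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// A minimal constructor-injection container, sketched to show what a
// "basic IoC" buys you: registration plus naive recursive resolution,
// with no interception, no lifetimes, and no cycle detection.
public class TinyContainer
{
    private readonly Dictionary<Type, Type> registrations =
        new Dictionary<Type, Type>();

    public void Register<TService, TImpl>() where TImpl : TService
    {
        registrations[typeof(TService)] = typeof(TImpl);
    }

    public TService Resolve<TService>()
    {
        return (TService)Resolve(typeof(TService));
    }

    private object Resolve(Type service)
    {
        Type impl;
        if (!registrations.TryGetValue(service, out impl))
            impl = service; // unregistered: assume it is concrete

        // Pick the greediest constructor and satisfy it recursively.
        // A circular dependency would overflow the stack right here.
        ConstructorInfo best = null;
        foreach (ConstructorInfo ctor in impl.GetConstructors())
        {
            if (best == null ||
                ctor.GetParameters().Length > best.GetParameters().Length)
                best = ctor;
        }

        ParameterInfo[] parameters = best.GetParameters();
        object[] args = new object[parameters.Length];
        for (int i = 0; i < parameters.Length; i++)
            args[i] = Resolve(parameters[i].ParameterType);
        return best.Invoke(args);
    }
}

// Illustrative services showing a nested dependency being resolved.
public interface ILogger { }
public class ConsoleLogger : ILogger { }
public class OrderService
{
    public readonly ILogger Logger;
    public OrderService(ILogger logger) { Logger = logger; }
}
```

Calling `Resolve<OrderService>()` after registering `ILogger` walks the constructor chain and wires everything up; the point is how much is missing compared to a real container, not how much is there.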
If you aren't working at that level, you may be able to roll out simple tools and be successful, but I wouldn't be able to work within those limitations (I am often not able to work within the limitations of the current tools, at which point I extend them :-) ).
Anyway, just a point I wanted to make. Don't bother starting with a blank slate anymore; it is not worth it.
Comments
I think this is one area where DSL Tools will make their mark, as they can codegen from just the .NET Framework rather than relying on another library. OR/M may be a little too complex for that at the moment. I always try to use as few external libraries as possible, even if it means using custom generation, since you get more specific, performant code.
Mike,
I completely disagree. The DSL tools are good in concept, but the implementation basically tries to code-gen entire applications.
That is code duplication. I would rather push everything to a common framework; then I get the same benefit that I would from a software factory, but I wouldn't have to deal with nasty code duplication issues and I would have a better design.
About perf, I would like to see numbers first; you would be surprised how much work goes into making the "general" tool really performant.
I completely respect and understand your point of view, you should get the MVP just for speaking your mind on this blog and the debates that arise :)
Software Factories, which I am not too excited about (they are downright scary), are for generating entire applications. DSL Tools, on the other hand, are for generating sections of a domain.
My only concern about DSLs is that, moving forward to VS.NEXT, the DSL no longer works and you have a problem making even a small change.
So if a DSL is generating a DAL with zero tool dependency, is this really a problem? I know tools like NHibernate perform well, and hats off to the work there.
Take object proxies, for example; I have profiled them, and they have performance impacts. My thinking is to accept a little bloat / a slightly larger memory footprint and provide the functionality you need, all generated by a DSL.
Now this starts leaking into another discussion about the basic needs of a domain object (persistence, binding, validation, rules, authorization), so we need 5 tools / frameworks?
Why not build this in using the framework?
I know this is a different approach and I totally respect your opinion on this.
Thanks for the great blog.
Yes, I agree with Ayende here - if you're going to use a DSL, use only the DSL. DSLs typically generate code that is ugly.
BTW, I thought DSLs were used to model business logic (and possibly UI) & generate code from it? I do think codegen tools for domain models are useful.
If object proxies are useful enough, I'm sure MS will implement them in C# 4.0 and optimize the compiler for them. I think readability and maintainability are much more important considerations than performance in enterprise apps - if you're going to need performance, then have some sort of intermediate parser between your code and the compiler to optimize it (except I'm not sure whether such a beast exists yet).
Chris
I agree with most of the above. Although my personal development lately consists of very small projects, I can't live without TestDriven.Net or Reflector. I haven't used IoC, but I expect it is something I'm going to use from now on.
As opposed to a small group that can do whatever it wants and keep the cost at a minimum, in a bigger team there's a bigger marginal cost for all the tools out there. If everyone uses his favorite tool, other people also need to know it in order to support it.
There should be some kind of balance there to find the optimal point.
Chris,
Just because there are only a few DSLs available today, and yes, I agree most generate ugly code, that's not to say that the concept isn't a solid one. I have high hopes for ActiveWriter, the best example of a DSL today in my opinion. DSL is young, and I see huge potential here. I am not saying never use a 3rd party or OSS library, just that good DSL designers can eliminate some of this.
Gil,
I agree balance is needed here; I try not to run to a tool unless absolutely necessary.
One thing I don't really get, and this is slightly OT: when do you ever design an entity that does not require persistence, validation, rules, authorization, and binding? So why not build at least some of this into the library? I am not talking about hardcoding rules or validation, simply adding the events, interfaces, and services necessary to facilitate this, rather than doing POCO and tools.
Why not build in the basic plumbing, and in the rare case you need more, go grab the tool?
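One hypothetical shape of that built-in plumbing: a base entity that raises change events for binding and exposes a simple validation hook. This is a sketch of the idea being described, not code from any actual library; the class and member names are invented, and a real design would also need to cover persistence, rules, and authorization:

```csharp
using System.Collections.Generic;
using System.ComponentModel;

// Hypothetical base class illustrating "building the plumbing in":
// change notification for databinding plus a per-property validation
// hook exposed through the standard BCL interfaces.
public abstract class EntityBase : INotifyPropertyChanged, IDataErrorInfo
{
    public event PropertyChangedEventHandler PropertyChanged;

    private readonly Dictionary<string, string> errors =
        new Dictionary<string, string>();

    // Assign a backing field and raise PropertyChanged if it changed.
    protected void SetField<T>(ref T field, T value, string propertyName)
    {
        if (EqualityComparer<T>.Default.Equals(field, value))
            return;
        field = value;
        PropertyChangedEventHandler handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }

    // Subclasses record or clear validation errors per property.
    protected void SetError(string propertyName, string message)
    {
        errors[propertyName] = message;
    }

    protected void ClearError(string propertyName)
    {
        errors.Remove(propertyName);
    }

    string IDataErrorInfo.Error
    {
        get { return string.Join("; ", new List<string>(errors.Values).ToArray()); }
    }

    string IDataErrorInfo.this[string propertyName]
    {
        get
        {
            string message;
            return errors.TryGetValue(propertyName, out message)
                ? message : string.Empty;
        }
    }
}

// A domain object that gets binding and validation support for free.
public class Customer : EntityBase
{
    private string name;
    public string Name
    {
        get { return name; }
        set
        {
            SetField(ref name, value, "Name");
            if (string.IsNullOrEmpty(name))
                SetError("Name", "Name is required");
            else
                ClearError("Name");
        }
    }
}
```

The trade-off Mike is pointing at is visible here: every entity inherits the plumbing, at the cost of a mandatory base class instead of POCO entities.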
Mike,
My issue is that having a DSL generate the code still doesn't mean that you can ignore code duplication. I don't think that aspiring toward zero tool dependencies is even a good goal. It basically means that you have to do everything from scratch.
About object proxies, that really depends on how they work internally. Take Castle.DynamicProxy2, for instance: it is highly optimized, and I don't think you would be able to get better performance if you were writing it by hand.
@Chris,
Object proxies are extremely useful; they are the foundation of NHibernate, AOP, and a lot of other interesting ideas.
From my understanding, Andres is objecting to the idea.
Because it is more maintainable to put it in the tool in the first place. I dislike the idea of having duplicate code in my entities. I would rather build a framework that understands the way I work than use the other approach.
Check my post about adding INotifyPropertyChanged to NHibernate.
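The interception trick in that post can be pictured like this: the entity stays a plain class with virtual properties, and at runtime a proxy library like DynamicProxy emits a subclass roughly equivalent to the one hand-written below. The proxy class here is written out manually purely for illustration (the names are made up, and the real generated code differs), but it shows where the "one extra method call" of overhead comes from:

```csharp
using System.ComponentModel;

// The entity stays a plain class; only virtual members can be proxied.
public class Customer
{
    private string name;
    public virtual string Name
    {
        get { return name; }
        set { name = value; }
    }
}

// Roughly what an interception proxy generates at runtime, written by
// hand here for illustration: the subclass overrides the virtual setter
// and raises PropertyChanged after delegating to the original.
public class CustomerProxy : Customer, INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    public override string Name
    {
        get { return base.Name; }
        set
        {
            base.Name = value; // "proceed" to the original setter
            PropertyChangedEventHandler handler = PropertyChanged;
            if (handler != null)
                handler(this, new PropertyChangedEventArgs("Name"));
        }
    }
}
```

The entity itself never mentions INotifyPropertyChanged; the extra cost over implementing it directly is essentially one additional virtual call per setter.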
Ayende,
Thanks for the comments.
I reread that post and I get what you're doing; yes, it's very cool. Can you say that interception in this manner is more performant than simply implementing INotifyPropertyChanged? The best argument I can see for your approach is that .NET databinding is so screwed up between ASP.NET, WinForms, and WPF, so you may have a good point there.
Now I am going to go read your latest p&p post; it's Saturday, so it should be quite spicy :)
Mike,
As fast as implementing it directly? Sure, you get to write the code only once :-)
Seriously, it probably costs another method call, but that is literally nothing in the grand scheme of things.
As always, measure and find out, but I'm willing to bet that there isn't a reasonable scenario where the interception would be the biggest issue.
Ahh, I read my post and realised I could've written it better. I do use interception all the time; what I meant before should be taken in the context of DSLs vs. frameworks in terms of performance.
Chris
The answer for me would be a mostly straightforward "yes." You make way too many assumptions about what "application" means. Granted, a large percentage of .NET applications are web- or CRUD-oriented, in which case tools such as the ones you advocate are invaluable.
However, there are a lot of application types that do not necessitate such tools (some examples: compilers, business engines, 3D engines, non-database server-side software...), even though they sometimes help.