One of the most common architectures for web apps right now is based on passing Data Transfer Objects (DTOs) to and from CRUD services that update your business/domain entities using tools like AutoMapper and Entity Framework.
I will try to explain why this is a truly horrible approach.
Horrible from a technical perspective
First, let's set up a scenario that we can reason about:
The scenario here is an Order service with an Order entity and an OrderDTO.
Example code (pseudo code):
```csharp
//entities
class Order
{
    int Id; //auto inc id
    List<Detail> Details;
}

class Detail
{
    int Id; //auto inc id
    decimal Quantity;
    string ProductId;
}

//DTO's
class OrderDTO
{
    int Id;
    List<DetailDTO> Details;
}

class DetailDTO
{
    int Id;
    decimal Quantity;
    string ProductId;
}
```
I’m pretty sure you have seen something similar to this before.
Let's also assume there is some sort of service that updates orders:
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    ...
}
```
So what do we need to do in order to make this work?
Maybe you do something like this:
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    //this code would probably reside somewhere else,
    //in some app startup or config
    Mapper.CreateMap<OrderDTO, Order>();
    Mapper.CreateMap<DetailDTO, Detail>();

    //real code starts here
    var context = ... //get some DbContext
    var order = Mapper.Map<OrderDTO, Order>(orderDTO);
    context.OrderSet.Add(order);
    context.SaveChanges();
}
```
Looks fair enough?
This will work fine for a new order: the code above maps the DTOs to new entities, adds those entities to the DbContext, and saves the changes.
All is fine and dandy.
But what happens if we pass an existing order?
We will get into trouble because of the context.OrderSet.Add: existing orders become new orders in the DB because we are always "adding".
So what can we do about this?
Maybe we can do it like this?
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    //this code would probably reside somewhere else,
    //in some app startup or config
    Mapper.CreateMap<OrderDTO, Order>();
    Mapper.CreateMap<DetailDTO, Detail>();

    //real code starts here
    var context = ... //get some DbContext

    //try to fetch an existing order with the same Id as the DTO
    var order = context.OrderSet.FirstOrDefault(o => o.Id == orderDTO.Id);
    if (order == null)
    {
        //if no order was found, create a new one and add it to the context
        order = new Order();
        context.OrderSet.Add(order);
    }

    //map the DTO onto our _existing_ or new order
    Mapper.Map<OrderDTO, Order>(orderDTO, order);
    context.SaveChanges();
}
```
Ah, that’s better, right?
Now we will not add new orders every time we try to update an existing one.
So, what's the problem now?
We still create new detail entities every time: AutoMapper knows nothing about EF or IDs, so it simply recreates every object in the Details collection.
So even if we can create and update the order entity, we are still messed up when dealing with the details.
Let's try again:
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    var context = ... //get some DbContext

    //let AutoMapper find or create the entity based on Id
    Mapper.CreateMap<OrderDTO, Order>()
        .ConstructUsing((OrderDTO dto) =>
        {
            if (dto.Id == 0)
            {
                var order = new Order();
                context.OrderSet.Add(order);
                return order;
            }
            return context.OrderSet
                .Include("Details")
                .First(o => o.Id == dto.Id);
        });

    //let AutoMapper find or create the entity based on Id
    Mapper.CreateMap<DetailDTO, Detail>()
        .ConstructUsing((DetailDTO detailDTO) =>
        {
            if (detailDTO.Id == 0)
            {
                var detail = new Detail();
                context.DetailSet.Add(detail);
                return detail;
            }
            return context.DetailSet.First(d => d.Id == detailDTO.Id);
        });

    //AutoMapper now knows when to create a new entity
    //and when to fetch one from the DbContext
    Mapper.Map<OrderDTO, Order>(orderDTO);
    context.SaveChanges();
}
```
OK, now we are on the right track, we can update existing entities and create new entities when needed.
All of the magic is in the AutoMapper config.
Nice, right?
This will work fine for every Create or Update scenario.
Well, err... it won't handle deleted details. If a detail has been removed from the orderDTO's detail collection, we are left with detail entities that are unbound in Entity Framework.
We have to delete those, or we will get a foreign key exception when calling SaveChanges.
OK, so let's see what we can do about this:
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    var context = ... //get some DbContext

    Mapper.CreateMap<OrderDTO, Order>()
        .ConstructUsing((OrderDTO dto) =>
        {
            if (dto.Id == 0)
            {
                var order = new Order();
                context.OrderSet.Add(order);
                return order;
            }
            return context.OrderSet
                .Include("Details")
                .First(o => o.Id == dto.Id);
        })
        //find out which details no longer exist in the DTO and
        //delete the corresponding entities from the DbContext
        .BeforeMap((dto, o) =>
        {
            o.Details
                .Where(d => !dto.Details.Any(ddto => ddto.Id == d.Id))
                .ToList()
                .ForEach(deleted => context.DetailSet.Remove(deleted));
        });

    Mapper.CreateMap<DetailDTO, Detail>()
        .ConstructUsing((DetailDTO detailDTO) =>
        {
            if (detailDTO.Id == 0)
            {
                var detail = new Detail();
                context.DetailSet.Add(detail);
                return detail;
            }
            return context.DetailSet.First(d => d.Id == detailDTO.Id);
        });

    Mapper.Map<OrderDTO, Order>(orderDTO);
    context.SaveChanges();
}
```
Ah, sweet, now we can handle add, update and deletes for the order details.
This code will work fine in your unit tests..
But..
It will bite you in production.
Why?
Because we are binding the DbContext via closures in the static AutoMapper mappings.
This code is not thread safe: if two requests arrive at the server at the same time, it will break, because both will have mappings bound to one of the active DbContexts.
AutoMapper's static Mapper class is not thread safe, so we have to modify our code even further.
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    var config = new ... //create an instanced mapper instead of using the static API
    var mapper = new AutoMapper.MappingEngine(config);
    var context = ... //get some DbContext

    config.CreateMap<OrderDTO, Order>()
        .ConstructUsing((OrderDTO dto) =>
        {
            if (dto.Id == 0)
            {
                var order = new Order();
                context.OrderSet.Add(order);
                return order;
            }
            return context.OrderSet
                .Include("Details")
                .First(o => o.Id == dto.Id);
        })
        //find out which details no longer exist in the DTO and
        //delete the corresponding entities from the DbContext
        .BeforeMap((dto, o) =>
        {
            o.Details
                .Where(d => !dto.Details.Any(ddto => ddto.Id == d.Id))
                .ToList()
                .ForEach(deleted => context.DetailSet.Remove(deleted));
        });

    config.CreateMap<DetailDTO, Detail>()
        .ConstructUsing((DetailDTO detailDTO) =>
        {
            if (detailDTO.Id == 0)
            {
                var detail = new Detail();
                context.DetailSet.Add(detail);
                return detail;
            }
            return context.DetailSet.First(d => d.Id == detailDTO.Id);
        });

    mapper.Map<OrderDTO, Order>(orderDTO);
    context.SaveChanges();
}
```
Now then?
Yeah, we solved it.
This will work in threaded environments, and it does solve the add, modify and delete issues.
However, our code is now so bloated with AutoMapper configs that it is hard to tell what is going on here.
Even if this solves the use case at hand, I find it a horrible design: hand this so-called "simple design" to a rookie who is going to maintain the code for years to come, and he will have no idea what to do with it.
Pros of this design:
- Looks simple on paper
- Easy to implement on read side and client side
Cons:
- Bloody horrible to implement on the write side, and it gets even worse the larger the DTO is
- Relies on magic names if using AutoMapper
- Network-inefficient if dealing with large DTOs
- You lose semantics and intentions, there is no way to know WHY a change happened
But do note that this can be a valid design IF you use a document database or similar.
That is, if you don't have a relational store where you have to analyze what changes were made to the server data.
e.g. in a REST API backed by MongoDB or such. But that has little to do with the key point of this article, which is auto-mapping onto O/R-mapped entities.
One of the biggest pain points in this specific case is actually AutoMapper. I think AutoMapper is a great tool, just not for this specific use case.
If we remove AutoMapper from the equation, we could end up with code that looks like this:
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    var ctx = ... //get some DbContext

    var order = ctx.OrderSet.FirstOrDefault(o => o.Id == orderDTO.Id);
    if (order == null)
    {
        order = new Order();
        ctx.OrderSet.Add(order);
    }

    //Map properties
    order.Address = orderDTO.Address;

    //remove deleted details
    order.Details
        .Where(d => !orderDTO.Details.Any(detailDTO => detailDTO.Id == d.Id))
        .ToList()
        .ForEach(deleted => ctx.DetailSet.Remove(deleted));

    //update or add details
    orderDTO.Details.ForEach(detailDTO =>
    {
        var detail = order.Details.FirstOrDefault(d => d.Id == detailDTO.Id);
        if (detail == null)
        {
            detail = new Detail();
            order.Details.Add(detail);
        }
        detail.Name = detailDTO.Name;
        detail.Quantity = detailDTO.Quantity;
    });

    ctx.SaveChanges();
}
```
IMO, this is cleaner; a rookie should not have much of a problem understanding what is going on, though the "remove deleted details" part may cause some confusion.
But it is still more readable than the first AutoMapper approach (and refactor-friendly).
If I saw the above code in a real project, I wouldn't be too bothered; it's not a disaster. But do remember that it will get worse the larger and more complicated the DTO is.
Just imagine that the details have references to product entities: we soon have a big ball of "ripple load", where each detail iteration triggers a load of a product.
If you continue down this route, you will end up with a really hard-to-maintain code base and an angry DBA breathing down your neck, because the code touches persisted objects in every iteration.
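One way to blunt the ripple load is to eagerly load the whole graph up front instead of touching lazy navigation properties inside the loop. A sketch, assuming EF6 and that each Detail exposes a Product navigation property (which the pseudo entities above do not actually show):

```csharp
using System.Data.Entity; // EF6: provides the lambda-based Include extension
using System.Linq;

// Hypothetical: ctx is a live DbContext with an OrderSet
var order = ctx.OrderSet
    .Include(o => o.Details.Select(d => d.Product)) // one query for the whole graph
    .First(o => o.Id == orderDTO.Id);

foreach (var detail in order.Details)
{
    // detail.Product is already materialized; no extra query per iteration
    var productName = detail.Product.Name;
}
```

With the graph loaded once, the per-detail loop stops issuing one SELECT per product, though it does nothing to solve the change-tracking problems discussed above.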
Conclusion:
AutoMapper is a great tool, but it is not built for mapping DTOs to O/R-mapped entities; it lacks knowledge of things like dirty tracking, identity maps and orphaned entities.
Entity Framework is what it is. Many will bash it for being a naive mapper; I think it does its job in most cases, even if it can't handle all the fancy mappings that NHibernate can.
The problem here is that most people know little to nothing about how an ORM really works behind the scenes, and simply believe that you can slap a changed object graph onto the Context/Manager/Session/UoW and the mapper will magically figure out what the heck you did to that object. Sorry, but that's not how it works.
If it did, it would have to run crazy numbers of queries against the database to compare the new state to the old state.
If you decide to use an advanced tool like an ORM, make sure you know how it works. ORMs are insanely complex, and they try to hide that complexity by pretending you are working with a normal object graph. This is the true danger of ORMs: they make explicit concepts like a DB query implicit. You can't see what is going on; you just touch objects and properties.
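If nothing else, you can make the implicit visible by asking the ORM what it intends to do before saving. A minimal sketch, assuming EF6 and a live DbContext named `ctx`:

```csharp
using System;

// Print what EF's change tracker has decided for each tracked entity,
// so the hidden unit-of-work becomes visible before SaveChanges runs.
foreach (var entry in ctx.ChangeTracker.Entries())
{
    Console.WriteLine("{0} -> {1}",
        entry.Entity.GetType().Name,
        entry.State); // Added, Modified, Deleted, Unchanged or Detached
}
```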
Horrible from a conceptual perspective
The design described in this post prevents you from doing Domain-Driven Design; it is 100% impossible to use DDD if you are patching state directly into your entities from a DTO.
You are simply throwing state around without any safety net; your entities can go into all sorts of invalid states.
e.g. if you receive a "TriangleDTO" with all its angles set to 0, oh well, just slap those numbers straight into your Triangle entity... a sum of 0 for all the angles of a triangle is valid, right?
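Compare that with an entity that guards its own invariants. A minimal sketch (the Triangle type and its rule are illustrative, not from the post's domain):

```csharp
using System;

public class Triangle
{
    public double A { get; private set; }
    public double B { get; private set; }
    public double C { get; private set; }

    public Triangle(double a, double b, double c)
    {
        // the entity refuses to enter an invalid state,
        // no matter what numbers a DTO tries to slap into it
        if (Math.Abs(a + b + c - 180.0) > 0.0001)
            throw new ArgumentException("The angles of a triangle must sum to 180 degrees.");
        A = a;
        B = b;
        C = c;
    }
}
```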
When I do domain modelling I want to capture the language and intentions of the domain experts.
If the domain experts say that you can add a product to an order, I want to see that somewhere in my code.
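For example, the ubiquitous language could surface directly on the aggregate instead of being buried in property mapping. A sketch (the method and its merge rule are illustrative, not from the post):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Detail
{
    public string ProductId { get; private set; }
    public decimal Quantity { get; private set; }

    public Detail(string productId, decimal quantity)
    {
        ProductId = productId;
        Quantity = quantity;
    }

    public void IncreaseQuantity(decimal by) { Quantity += by; }
}

public class Order
{
    private readonly List<Detail> _details = new List<Detail>();
    public IReadOnlyList<Detail> Details { get { return _details; } }

    // an intent-revealing operation, straight from the domain expert's language
    public void AddProduct(string productId, decimal quantity)
    {
        if (quantity <= 0)
            throw new ArgumentException("Quantity must be positive.");

        var existing = _details.FirstOrDefault(d => d.ProductId == productId);
        if (existing != null)
            existing.IncreaseQuantity(quantity); // merge instead of duplicating lines
        else
            _details.Add(new Detail(productId, quantity));
    }
}
```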
But more on that in the next post.
Well, this is all for now, I hope I managed to explain some of the pitfalls of this architecture.
In my next post I will be explaining why Command pattern and semantics matter.
Yes. This. Add IoC to the cocktail and things can get even messier. It really is a fine balance to strike, finding the line between "FRAMEWORK ALL THE THINGS" and compromising by not going as far and doing it manually.
Geez, all you gotta do is
```csharp
public JsonResult UpdatePrelim(PrelimViewModel prelim)
{
    var p = db.Prelims.First(x => x.PrelimId == prelim.PrelimId);
    Mapper.Map(prelim, p);
    db.SaveChanges();
    return Json(new { Result = "OK" });
}
```
@damnmaxims, you didn't read the article, did you?
Because if you did, you would know that the problem arises when using hierarchical objects, e.g. order + detail.
The approach you show here _is included in the post_ and, as I state there, it only works for the top-level object (in your case "prelim"); it will break for any referenced sub-objects.
Turns out that I’m not the only one thinking this is a huge problem.. https://entityframework.codeplex.com/workitem/864
Thank you Roger. I could not agree more. I've had to resort to plain old iteration through the child collections in the customer->order->orderDetail paradigm on inserts and updates. There is no free lunch with AutoMapper; I learned that the hard way. Could I ask you to briefly explain the LINQ statement you use for removing deleted details in your iteration example? I know WHY you do it, but the LINQ is confusing. Thank you again.
Totally agree with you here, especially the part where you make the distinction between working with a relational DB and a document-based DB, stating that it could indeed be a valid design in the latter case. Learned this the hard way after trying too hard to stick with what is considered good “relational design” with a NoSQL DB (which is dumb in the first place), so I can relate to this article but… the other way around.
Interesting article; thought provoking. I definitely don’t agree with every point you make [1], but you highlight the pain points with using Entity Framework.
One point I would make: don't throw out the baby with the bathwater with respect to AutoMapper. It clearly has its limitations[2], but it still has its uses, even in your last code block. We can't use it effectively to map relationships between models, but we can still use it to map the properties on the individual models.
I would use AutoMapper to set the order's address and the detail's name/quantity. In this example code, it would actually make the code a bit harder to follow. In the real world, where models commonly have dozens of properties, it can be a big time and frustration saver.
The DTO pattern is not a panacea. It is possible to force any operation into that mold, but that doesn’t mean you should. If you are adding “add a delivery address” functionality, there is no reason you should build the entire object tree in the DTO layer, add the address, then map that entire tree back down to your persistence layer. I have seen it done and it leads to madness.
However, DTO’s do have their place. If I am creating functionality that is effectively “allow the user to view and modify most any field in the object map,” then a DTO pattern can be ideal. This is especially the case if I want to separate the act of editing and saving. I can let the user modify the DTO to their heart’s content (within the rules and parameters of the domain), then map it all to the persistence layer when they hit save.
The problem specific to EF is that it is very difficult to use DB models directly in your view layer; detached objects are too hard to work with. As a result, developers need a place to put the boiler-plate code needed to re-associate their data, and they turn to their old friend the DTO.
[1] For instance, your “TriangleDTO” mini-rant. So, we are not allowed to have validations in DTO land now?
[2] I would say these limitations spring from problems within EF, but that is another rant.
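For the record, the re-association boilerplate being described usually looks something like this in EF6 (a sketch; it only works for a flat entity, while object graphs still need the per-child handling discussed in the post):

```csharp
using System.Data.Entity;

// Hypothetical: order is a detached, already-modified entity
// and ctx is a fresh DbContext.
ctx.OrderSet.Attach(order);
ctx.Entry(order).State = EntityState.Modified; // mark every scalar property dirty
ctx.SaveChanges();                             // emits a single UPDATE
```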
Hi Dan.
My argument is not really against DTOs per se, but rather the fact that it gets problematic when mapping them back to the entity model.
There is no clean and easy way to deal with navigational properties. Primitive properties work fine; complex types (value objects) work fine in some cases and not at all in others.
If you have a persistence layer that can handle detached objects better, then there is no problem except that you lose semantics, but that might be OK in many cases.
1) You can absolutely do that, but it removes the responsibility of validation from the entity/domain layer and places it in data transfer objects, which IMO is sub-optimal.
2) yes, the title of the post is : “Why mapping DTOs to Entities using AutoMapper and __EntityFramework__ is horrible”
As a preface, keep in mind I'm an experienced C++ developer but relatively new to C# and .NET. I may be getting some terms wrong. :)
Actually I had done something similar to AutoMapper before I found it and got it all working with Entity Framework. I’m now switching to AutoMapper because I like the paradigm (not to mention it’s probably better/faster than my implementation).
That said, to solve the Entity Framework issue, I used a slightly different approach which takes less code. Seems to work though I’m not sure about multi-threading issues. Perhaps you can comment?
Basically it goes like this:
– I have Details be a navigation property of Order
– I ensure the OrderId and Id in Detail are part of the primary key. This causes EF to delete the record when it is removed from the navigation property
– When it comes time to update Order, I load order in my method, then map the passed in DTO to the EXISTING Order.
– The maps are setup to copy the main attributes and then handle the collections via a method that will properly resolve existing items, add new items and remove missing items.
To make it work I set up the mappings as such (elsewhere, statically):
```csharp
Mapper.CreateMap();
Mapper.CreateMap()
    .ForMember(dest => dest.Details, opt => opt.Ignore())
    .AfterMap((src, dst) =>
    {
        ListMap(src.Details, dst.Details);
    });
Mapper.CreateMap();
Mapper.CreateMap();
```
The ListMap method is a generic method that handles the proper removal and mapping. You can specify which attributes to use as the "ID". It uses reflection, and might be slow, but it simplifies things.
```csharp
public static void ListMap<TInput, TOutput>(
    ICollection<TInput> from, ICollection<TOutput> to,
    string fromID = "ID", string toID = "ID")
    where TOutput : class, new()
{
    if (from == null)
        return;

    //update existing items and add new ones
    foreach (TInput tIn in from)
    {
        object IDOfFrom = GetProp(tIn, fromID);
        TOutput tOut = null;
        foreach (TOutput o in to)
        {
            object IDOfTo = GetProp(o, toID);
            if (IDOfFrom.Equals(IDOfTo))
            {
                tOut = o;
                break;
            }
        }
        if (tOut == null)
        {
            tOut = new TOutput();
            to.Add(tOut);
        }
        Mapper.Map(tIn, tOut);
    }

    //remove items missing from the source
    //(snapshot with ToList so we can remove while iterating)
    foreach (TOutput tTo in to.ToList())
    {
        object IDOfTo = GetProp(tTo, toID);
        bool found = false;
        foreach (TInput tIn in from)
        {
            object IDOfFrom = GetProp(tIn, fromID);
            if (IDOfTo.Equals(IDOfFrom))
            {
                found = true;
                break;
            }
        }
        if (!found)
        {
            to.Remove(tTo);
        }
    }
}
```
The trick here is that with EF, if the primary key of an entity includes the foreign key, the row gets deleted when the item is removed from the list. Combined with loading the corresponding parent object first and everything seems to work. The DbContext used is part of the object being given to AutoMapper so I expect this would be thread safe.
Any comments? Am I missing something? Any criticism is appreciated.
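For reference, the identifying-relationship trick described above can be sketched in EF6 Code First fluent configuration like this (assuming a Detail entity that exposes an OrderId foreign-key property; the names are illustrative):

```csharp
using System.Data.Entity;

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // composite primary key that includes the foreign key: EF then treats
    // Details as owned by their Order, and a Detail removed from the
    // navigation collection is deleted rather than orphaned
    modelBuilder.Entity<Detail>()
        .HasKey(d => new { d.OrderId, d.Id });

    modelBuilder.Entity<Order>()
        .HasMany(o => o.Details)
        .WithRequired()
        .HasForeignKey(d => d.OrderId);
}
```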
Forgot the GetProp method:
```csharp
public static object GetProp(object obj, string propName)
{
    if (obj == null)
        return null;
    PropertyInfo pi = obj.GetType().GetProperty(propName);
    if (pi == null)
        return null;
    return pi.GetValue(obj);
}
```
Sorry about the repeated posts, I forgot some more details.
My EF project is Model First, which probably helps, as my classes are aware of linked entities and can reference them directly via navigation properties without doing separate lookups.
The actual mapping would go like this:
```csharp
var ctx = ... //get some DbContext
var order = ctx.OrderSet.FirstOrDefault(o => o.Id == orderDTO.Id);
Mapper.Map(orderDTO, order);
```
I’ve moved away from automapper because of problems like this. Mapping objects is easy to do, we don’t need a magic tool to do it for us. Say no to automapper.
In case that DTO’s is used to transport data (entities) via HTTP to clients, I really don’t see why is using AutoMapper would be a cons.
Also one notice, above of EntityFramework (ORM) layer , is probably REST webservice which is by the book stateless. So every request/response is stateless operation so in that case there is no need to track Entities. In this architecture schema “Client HTTP WebService(REST) AutoMapper EF Relational Database”, I realy do not see reason to exclude AutoMapper.
@Johnnie, if you want to _update_ an entity, using automapper or manually, you _have to_ do your own change tracking.
You can not get around this by adding layers ontop of it, somewhere, you will have to deal with change tracking and find out what changed in your entity.
there are two cases where you can get around this:
1) if your entities don't have any hierarchies (it's just a flat object), then you are all fine.
2) you do append only inserts, never updating but rather adding a new version of the entity.
(Which is probably what you are referring to above)
For anyone who doesn’t agree, feel free to send a gist showing how you deal with updates of hierarchical entities in your codebase..
Roger,
I hear what you’re saying here with its simplicity and the problems it avoids. I have a question though, which is sort of back to the original premise, regarding this code.
//Map properties
order.Address = orderDTO.Address;
Am I missing something or do you do this for all properties? If you do it for all properties, that’s super error prone for model changes and isn’t there a way to automap that?
Nigel, I am also wondering about this line of code. I don't think it is well-thought-out code.
@Nigel – The problem actually isn't with the mapping; it's more of an Entity Framework issue (or how it works). This is mainly to do with detached entities: when you try to save them, EF wants to insert them rather than update them (which causes primary-key-already-exists issues). So you have to take steps to circumvent this by managing your own change tracking and handling all the scenarios that deal with complex objects (and complex collections).

As you can see from my gist above, I have all Domain and Data objects derive from a base class (BaseDomainModel and BaseDataModel). With this I can manage change tracking in my Domain Model (with SetAsModified) and use the base classes in my mapping configuration (both base classes also have an ID field). With collections you also have to manage all scenarios, such as Add/Update/Delete, to make this work.

Unlike the blog poster, I went with a generic approach (EntityMapper), so for each Domain/Data Model I just need to create an instance of the mapper in my IoC and pass it to my repo (which all derive from DatabaseRepository). This makes mapping my DTOs to data models super simple and requires no specific per-repository configuration. I hope this helps.
–James
Updated gist to include a few updates and added the IoC Code for others to see.
Thanks for the article. I had the same problem: needing to map domain objects to EF ones in repositories. Your articles moved me in the right direction, as far as I can see.
Mapping continues to be horrible?
And where is your next post “Why Entity Framework Blows”. Because it does horribly. You’re using the wrong stack bud! Get out of bloated .NET
This article describes well the problems in practice when trying to map EF entities to DTOs and then pass them back down the pipeline from the UI to the repository for persistence. In fact, we had just about finished our DDD solution when the reality of getting persistence to work across all use cases proved a lot harder to implement than we had imagined. It's a wonderful pattern for read-only repositories, but try passing a modified DTO and a couple of related property collections with modifications, additions and deletes, and getting it to work is spectacularly complex.
EF (v6.1 in our case) makes hard work of this, as it does not persist (add, update or delete) related entities re-attached to the DbContext; you have to do all the hard work of working out the changes to get the persistence to cascade. Isn't this something that should be part of the EF implementation out of the box? It really should be. Frankly, this was such a show stopper that we were prepared to walk away from the whole DDD pattern we had built around a repository pattern using EF.
GraphDiff was the silver bullet for our project and made this pattern work for us:
https://github.com/refactorthis/GraphDiff
GraphDiff is interesting and is pretty similar to the GIST I have above.
Thanks for the article. I'm an experienced NHibernate programmer and now I'm doing a few projects with Entity Framework. For me, this is a recurring problem. I always avoid AutoMapper for the mapping from DTO to Entity. Sometimes I do this manually, and other times I build interfaces like IMap<DetailDTO, Detail>.
Now, your post inspired me to do a basic framework to handle this problem. My framework only handles the iteration of the details and leaves to the user the implementation of the insert, update, delete and map.
The final usage ends up like this:
```csharp
// Definition of the mapper (should be created only once)
var orderDTOMapperBuilder = DetailMapperBuilder.Create<OrderDTO, Order>();

var itemsMap = orderDTOMapperBuilder
    .Detail((dto) => dto.Items, (e) => e.Items)
    .WithDependencies<IItemRepository>(required: true)
    .AddAction((ctx, item) => ctx.Dependencies.Insert(ctx.Master, item))
    .DeleteAction((ctx, item) => ctx.Dependencies.Delete(item))
    .CreateFunc((ctx) => new ItemOrder())
    .EqualsFunc((itemDTO, item) => itemDTO.Id == item.Id)
    .Build();

var orderDTO1 = /* comes from the client */;
var order1 = orderRepository.Find(orderDTO1.Id);

itemsMap.Map(orderDTO1, order1, itemRepository, (itemDTO, item) =>
{
    // mapping of internal properties (could be done with AutoMapper)
    item.ProductId = itemDTO.ProductId;
    item.Quantity = itemDTO.Quantity;
});
```
Source Code:
https://github.com/jgquiroga/DetailMapper
Explanation (in Spanish):
http://blog.jgqsolutions.com.ar/2015/12/detailmapper-mapping-desde-dtos.html
The code only has a couple of hours of development, so it is a very initial approach. I will appreciate any critiques or comments.
Has anyone tried using CodeDOM to generate the mapping code between a DTO and a Domain Entity?
I personally like the generics method I posted above. I have it in 3 production applications and it's working like a charm with little drawback.
https://github.com/TylerCarlson1/Automapper.Collection
This is my implementation to fix the problem, extending AutoMapper.
Handles the Insert Update Delete of the child, and sub-children.
Automapper.Collection.EntityFramework uses the PK’s for uniqueness comparison but you can overwrite them if you want to.
Only real problem is identifying Orphans.
https://github.com/TylerCarlson1/Automapper.Collection
This is what I created to solve the problem.
Use Persist and Delete calls which handle insert/update and delete statements for you.
Uses EF’s PK’s for uniqueness comparisons, or you can make your own.
The only issue is that deleting items orphans entries, but there is a generic way to identify them in EF, as per one of the issues.
Reverse it all? Treat EF's classes as the DTOs. Use commands to update the domain objects. AutoMapper the domain object -> EF object, then persist?
AutoMapper is not a perfect solution, but it saves an enormous amount of work and is pretty unobtrusive in practice. Contrary to the tone of the article, it better illuminates the deficiencies and problems of Entity Framework. I think this post is blaming the wrong tool.
Bug in the code: if we have more than one new DetailDTO, this code will not work! After inserting the first DetailDTO, it will have Id == 0. When we go to add another new DetailDTO (with Id == 0), the method FirstOrDefault(d => d.Id == detailDTO.Id) will return the previous Detail, and so on. (The Ids of new details are generated only after calling context.SaveChanges().)
So we have to use:

```csharp
var detail = detailDTO.Id == 0
    ? null
    : order.Details.FirstOrDefault(d => d.Id == detailDTO.Id);
```
```csharp
public void CreateOrUpdateOrder(OrderDTO orderDTO)
{
    var ctx = ... //get some DbContext
    var order = ctx.OrderSet.FirstOrDefault(o => o.Id == orderDTO.Id);
    if (order == null)
    {
        order = new Order();
        ctx.OrderSet.Add(order);
    }
    ctx.SaveChanges();
}
```

How does this line work?

```csharp
//Map properties
order.Address = orderDTO.Address;
```
What about using an OrderCreateDto and AutoMapper for the create scenario, and not using AutoMapper for the edit scenario?
Hi Roger,
I got the link to this article from the comment section on my blog post devoted to the same topic https://cezarypiatek.github.io/post/why-i-dont-use-automapper/
As a solution I developed a Visual Studio plugin, based on Roslyn, that simulates AutoMapper at design time. I think it can be very helpful in case you don't use any runtime auto-mapping mechanism and have to write all the mapping code yourself. You can check out the project on GitHub: https://github.com/cezarypiatek/MappingGenerator
Both Entity Framework and AutoMapper have received multiple enhancements since this article was written, making some of the objections to using them together less relevant. To offer an analogy, both hammers and screws are good for building houses, but driving a screw with a hammer is a bad way of using both. If there was such a fundamental problem with using EF and AM together, one or the other would probably have been dead long ago. If you separate calling EF from within the AM mapping, you’ll see the main problem is code organization. If you’re getting bad results, consider the blame may lie with the operator, not the tools.
This article is now 6 years old, and I personally have not used either AutoMapper or EF for a very long time.
What would be interesting is an example of how you could use the modern versions of the two tools for the above-mentioned use case.
e.g.
given a DtoOrder with DtoDetails inside, how do you inform the corresponding EF entity which rows are now deleted, updated or new?
This post is still getting a lot of reads; could you, Ed, provide an example of how this is handled nowadays using these tools?
Regards
Roger