When your code processes large amounts of data, email attachments, for example, there is always a danger of running out of memory. Consider this code dealing with email attachments:
    using (OrganizationServiceContext crmContext = new OrganizationServiceContext(crmSvc))
    {
        var attachments = from a in crmContext.CreateQuery("activitymimeattachment")
                          select new
                          {
                              body = a.GetAttributeValue<string>("body")
                          };

        foreach (var att in attachments)
        {
            ProcessAttachment(att);
        }
    }
The danger here is running out of memory while retrieving the attached files. Even though, from the C# point of view, nothing in this code holds on to the data, the culprit is the OrganizationServiceContext, which keeps every entity retrieved via its LINQ queries attached to the context for the lifetime of that context.
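To make the problem visible, here is a diagnostic sketch (not part of the original code; it assumes the same crmSvc connection and ProcessAttachment helper as the snippet above) that reports the size of the managed heap as the loop progresses:

    using (OrganizationServiceContext crmContext = new OrganizationServiceContext(crmSvc))
    {
        var attachments = from a in crmContext.CreateQuery("activitymimeattachment")
                          select new
                          {
                              body = a.GetAttributeValue<string>("body")
                          };

        long processed = 0;
        foreach (var att in attachments)
        {
            ProcessAttachment(att);

            // The heap keeps growing because the context still references
            // every entity materialized so far.
            if (++processed % 100 == 0)
            {
                Console.WriteLine("{0} attachments processed, ~{1} MB in use",
                    processed, GC.GetTotalMemory(true) / (1024 * 1024));
            }
        }
    }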
Possible solutions to the problem include:
- Use pagination and re-initialize the context after every page (or a handful of pages) processed. The main drawback is that the entire context is thrown away and re-created each time, including any tracked data you didn't need to discard.
    int pageNumber = 0;
    int pageSize = 500;
    bool recordsExist = true;

    while (recordsExist)
    {
        pageNumber++;
        recordsExist = false;

        // A fresh context per page: whatever the previous context was
        // tracking becomes eligible for garbage collection.
        using (OrganizationServiceContext crmContext = new OrganizationServiceContext(crmSvc))
        {
            var attachments = from a in crmContext.CreateQuery("activitymimeattachment")
                              select new
                              {
                                  body = a.GetAttributeValue<string>("body")
                              };

            // The CRM LINQ provider translates Skip/Take into server-side
            // paging, so each pass retrieves a single page of records.
            foreach (var att in attachments
                .Skip((pageNumber - 1) * pageSize)
                .Take(pageSize))
            {
                recordsExist = true;
                ProcessAttachment(att);
            }
        }
    }
- Remove (detach) entities from the context immediately after processing. Detach takes an Entity as its argument, so you can no longer project into anonymous types in the LINQ select, which could somewhat affect performance: every attribute of each record is retrieved, not just the ones you need.
    var attachments = from a in crmContext.CreateQuery("activitymimeattachment")
                      select a;

    foreach (var att in attachments)
    {
        ProcessAttachment(att);

        // Stop tracking the entity so it becomes eligible for garbage
        // collection as soon as ProcessAttachment is done with it.
        crmContext.Detach(att, true);
    }
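For completeness, the snippets above assume a ProcessAttachment helper. A hypothetical sketch matching the detach variant (the name and body are illustrative, not from the original code) might look like this; in activitymimeattachment the body attribute holds the file content as a base64-encoded string:

    static void ProcessAttachment(Entity att)
    {
        // The "body" attribute stores the attachment content as a
        // base64-encoded string, so decoding briefly doubles the memory
        // footprint (string + byte array) until the string is collected.
        byte[] content = Convert.FromBase64String(att.GetAttributeValue<string>("body"));

        // ... write the bytes to a file, a stream, etc. ...
    }

The earlier, anonymous-type snippets would instead call an overload taking the projected type rather than Entity.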