BulkInsert - ERROR_34 #582
Hi. I suspect the problem is related to using SQL Express and hitting the DB size limit. It would be nice if a more specific error message could be returned. Something like:
Hello @mhsimkin , This error is currently thrown by our library to handle a special case where, for example, a navigation property contains the same entity twice in its collections. This situation raises an error because we must not insert the entity twice. The error message indeed deserves a better description; we will look to improve it. Best Regards, Jon
Hi @JonathanMagnan, Thank you for the explanation. Is this a duplicate key issue? Is there any way I can figure out which record(s) are causing the issue? Everything worked as expected until I discovered an error in my configuration that resulted in the Recs collection not being persisted to the database. Once I fixed that issue, the Error 34 started to appear. -marc
No. From what my developer told me, the same entity is found twice in the same list. Such as:
(I cannot currently test this scenario as I'm on vacation, but this is how I understand my developer's explanation.)
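The "same entity found twice in the same list" situation can be sketched roughly as follows. This is a minimal illustration with hypothetical `Parent`/`Rec` names, not the actual entities from this project:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical types for illustration only.
public class Rec { public long Ean { get; set; } }
public class Parent { public List<Rec> Recs { get; } = new(); }

public static class Demo
{
    public static void Main()
    {
        // The same Rec instance is referenced by two different parents...
        var shared = new Rec { Ean = 123 };
        var a = new Parent { Recs = { shared } };
        var b = new Parent { Recs = { shared } };

        // ...so flattening the collections yields the same entity twice,
        // which is the situation the library refuses to insert.
        var all = new[] { a, b }.SelectMany(p => p.Recs).ToList();
        Console.WriteLine(all.Count);              // 2
        Console.WriteLine(all.Distinct().Count()); // 1
    }
}
```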
Not at this moment. We will try to look at it soon to provide more information. If you could create a runnable project with the issue, we would be happy to look at it and tell you the exact problem. It doesn't need to be your project, just a new solution with the minimum code to reproduce the issue. Best Regards, Jon
@JonathanMagnan, I will see what I can do. There are a few issues with trying to get a sample that will show the error:
If I can solve the first issue, then I should be able to get you something. I'm just concerned that skipping those initial records will result in the error not occurring. I don't think you want me to send you a data dump with 280,000 records. -marc
Here is a working sample. I was able to dump the 5K block of records where the issue occurs into a JSON file. The same app reads that file and attempts to do the bulk insert. The SQL connection string is hard-coded in Program.cs and the data is in dump.json. The app uses EF Code First to create the DB. I would not be surprised if the underlying issue is data related. I know the source data does have issues: the source system that updates the MongoDB database has been running for 10 years, and the data model has changed a few times. Thank you for the help. -marc
Thank you for the project. My developer will look into it. Best Regards, Jon
Hello @mhsimkin , Thank you again for the project. The issue happens because of how the HashCode is calculated for a record. Unlike a class, a record's HashCode is calculated from its property values instead of from the object instance. So if 2 different records have the exact same values, they are considered equal since they have the same HashCode. In your case, this happens 6 times; one of them is the record with `Ean == 9781504044424` used below. So this error indeed happens because our library gets the same (equal) entity more than once. Here is a simple piece of code to see what is happening:

```csharp
// Requires: System.IO, System.Linq, System.Text.Json
var jsonData = File.ReadAllText("dump.json");
var recsFromFile = JsonSerializer.Deserialize<List<ActiveRecRecord>>(jsonData);
var recs = recsFromFile.SelectMany(x => x.RecommendationData.Recs).ToList();
var count1 = recs.Count();
var count2 = recs.Distinct().Count();
var duplicate = recs.Where(x => x.Ean == 9781504044424).ToList();
```

So I believe the issue is currently within the data, as the 2 records contain exactly the same values. Let me know if that explains the issue correctly. Best Regards, Jon
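For readers unfamiliar with this C# detail, here is a small standalone sketch of the value-equality behavior of records that Jon describes. The type names are illustrative, not from the project:

```csharp
using System;
using System.Linq;

// A record gets compiler-generated value equality: Equals and GetHashCode
// are computed from the property values.
public record RecRecord(long Ean, double Score);

// A plain class keeps the default reference equality.
public class RecClass { public long Ean { get; set; } }

public static class Demo
{
    public static void Main()
    {
        var r1 = new RecRecord(9781504044424, 0.5);
        var r2 = new RecRecord(9781504044424, 0.5);
        Console.WriteLine(r1.Equals(r2));                        // True
        Console.WriteLine(r1.GetHashCode() == r2.GetHashCode()); // True
        // Distinct() therefore collapses the two records into one:
        Console.WriteLine(new[] { r1, r2 }.Distinct().Count());  // 1

        // Two class instances with the same values stay distinct:
        var c1 = new RecClass { Ean = 1 };
        var c2 = new RecClass { Ean = 1 };
        Console.WriteLine(c1.Equals(c2));                        // False
    }
}
```

This is why two `Rec` records loaded from different parts of the JSON dump can be "the same entity" to the library even though they are separate objects in memory.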
Hi @JonathanMagnan. I am on vacation this week. I will look at this after Memorial Day Weekend. Thank you. -marc |
@JonathanMagnan @mhsimkin - I am facing the same problem. Our application was recently updated to .NET 8 and, as a result, we have upgraded to the latest version of this library.
I have noticed that the notifications list is definitely all unique, but the primary key (OptimeIndex in this case) is 0 for each item, simply because it hasn't been created yet.
This is our implementation of the BulkInsertAsync method. Any ideas how I can avoid this issue?
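One pitfall worth ruling out in this situation (purely an assumption, not a confirmed diagnosis of this error): if a duplicate check is done on a database-generated key that is still 0 for every not-yet-inserted row, every row collides with every other, even though the entities themselves are unique:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical entity: OptimeIndex is assigned by the database on insert.
public class Notification
{
    public int OptimeIndex { get; set; } // still 0 before insert
    public string Text { get; set; } = "";
}

public static class Demo
{
    public static void Main()
    {
        var items = new List<Notification>
        {
            new() { Text = "a" },
            new() { Text = "b" },
        };

        // Grouping by the not-yet-assigned key flags everything as a
        // duplicate, even though the notifications are all unique.
        var falseDupes = items.GroupBy(x => x.OptimeIndex)
                              .Where(g => g.Count() > 1)
                              .SelectMany(g => g)
                              .ToList();
        Console.WriteLine(falseDupes.Count); // 2
    }
}
```

So when checking for duplicates before insert, group on a natural key (or the full set of meaningful values) rather than on an identity column that has not been generated yet.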
Hello @jasonc2901 ,
It depends; in the case of @mhsimkin , this was not really an issue from our library (unless we missed something) but an issue within the data: the same item existed twice under different parents. In your case, we don't know yet what caused the error; it might be an issue in our library or, indeed, a case that should not happen. So what can you do? The best way to find out is to make sure the count is equal to your distinct count. You can then find the duplicate values with some LINQ like:

```csharp
var recs = recsFromFile.SelectMany(x => x.RecommendationData.Recs).ToList();
var count1 = recs.Count();
var count2 = recs.Distinct().Count();

if (count1 != count2)
{
    // an error will happen in our library in this case, so you need to check it
    var duplicateRecs = recs.GroupBy(x => x.Id)
        .Where(x => x.Count() > 1)
        .SelectMany(x => x)
        .ToList();
}
```

Then, you need to understand whether duplicate records are normal or not (a bug from our library or an issue with your data). If you can provide a runnable project, we will be happy to look at it. Best Regards, Jon
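If the duplicates turn out to be expected in the data, one option (an assumption on the editor's part, not an official recommendation from the library) is to de-duplicate the list before the bulk operation with `DistinctBy`, available since .NET 6. Note that `DistinctBy` keeps the first occurrence per key, so this is only safe when that first entry is the one you want persisted:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical record type for illustration.
public class Rec { public int Id { get; set; } public double Score { get; set; } }

public static class Demo
{
    public static void Main()
    {
        var recs = new List<Rec>
        {
            new() { Id = 1, Score = 0.9 },
            new() { Id = 1, Score = 0.4 }, // duplicate key, different score
            new() { Id = 2, Score = 0.7 },
        };

        // DistinctBy keeps the FIRST occurrence for each key.
        var unique = recs.DistinctBy(x => x.Id).ToList();
        Console.WriteLine(unique.Count); // 2

        // A hypothetical bulk call would then receive only unique rows:
        // await context.BulkInsertAsync(unique);
    }
}
```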
Hi @JonathanMagnan, I'm catching up and reading your response about the duplicate key. Looking at this specific response: given my data structure, it is highly probable that the same EAN will exist in multiple ActiveRecRecord.RecommendationData.Recs collections. These collections are not indexed as part of the data model. The RecommendationData property of the ActiveRecRecord is supposed to be persisted as JSON. Here is the IEntityTypeConfiguration derived class.
The generated SQL Table DDL is:
-marc |
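For context, a JSON-column mapping of the kind Marc describes might look like the following in EF Core 7+. This is a sketch with assumed property names (`Id`, `RecommendationData`, `Recs`); Marc's actual configuration class was attached to the thread and is not reproduced here:

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

public class ActiveRecRecordConfiguration : IEntityTypeConfiguration<ActiveRecRecord>
{
    public void Configure(EntityTypeBuilder<ActiveRecRecord> builder)
    {
        builder.HasKey(x => x.Id);

        // Persist the owned RecommendationData graph (including the Recs
        // collection) as a single JSON column instead of separate tables.
        builder.OwnsOne(x => x.RecommendationData, nav =>
        {
            nav.ToJson();
            nav.OwnsMany(r => r.Recs);
        });
    }
}
```

With a mapping like this, the `Recs` entries live inside one JSON document per row, which is consistent with Marc's point that they are not indexed or keyed as part of the relational model.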
Thank you, Marc, for the additional information. I can definitely see some cases like yours where the same EAN legitimately appears in multiple collections.
Best Regards, Jon |
@JonathanMagnan, no, you can't use the first entry found. The order in the collection is important, and each occurrence could have a different score value. I don't know the internals of your method, but there seems to be some disconnect between what I expect and what your code is doing. Cheers, -marc
Hello @mhsimkin , I took some time to sit down and work through this issue with my employee to understand it better, as there was too much back and forth on it, by my fault. Starting with v8.102.3, our library will no longer throw the ERROR_34 in this case. From what my developer told me, that's exactly your scenario. Could you try it again and let me know if everything now works as expected? Best Regards, Jon
@JonathanMagnan, just give me a couple of days. I will let you know how I make out. Thank you. -marc |
@JonathanMagnan, I can confirm that this has been resolved. Thank you. -marc |
Description
Received the following error message during BulkInsert
ERROR_34: Oops! the error occured in the bulk operations. Report this error to our support team: [email protected] , v=8.102.2.4
Exception
Fiddle or Project (Optional)
Hi:
I am getting the below error when I execute this code:
The class is defined to EF as:
The Recs property of the activeRecs object is a List.
I have also attached a log file of all operations that occurred up until the error. Is there a way I can have the temp tables not deleted? Then I could send you the contents of the last temp table, assuming the error has something to do with an object in that table.
Thanks
Marc
Further technical details
operation.log