CSV Importer tells me that duplicates were skipped
Hi,
I use the CSV importer to import bank transactions.
I did an import with 3 records, but the imported records don't show up in the bank account. At first I thought I might not have clicked the Process button, so I tried the import again, but now I get a message that 3 duplicates have been skipped.
In the bank account I have selected to show all transactions. The 3 transactions have today's date and should therefore be at the top of the screen.
I did a search by date, but nothing was found.
I even used the search and replace extension to search for the transaction by date, but again, none were found.
I changed the sorting from date to description then back to date again. But I still don't see the 3 transactions.
What else can I do in MD to find the existing transactions that are not being displayed?
31 Posted by Gerd on 09 Sep, 2025 11:36 AM
Here are 2 examples from Amazon Marketplace. The first time it found 3 similar payees, the next time it found 4.
32 Posted by Stuart Beesley ... on 09 Sep, 2025 12:08 PM
Sorry. No good. Please send me the raw data as requested. Header row and two data rows.
33 Posted by Gerd on 09 Sep, 2025 01:14 PM
I know how to get the raw data for the transaction, but how do I get it for the header?
And would you want a particular one, like the Amazon or the Tesco transaction?
34 Posted by Stuart Beesley ... on 09 Sep, 2025 01:28 PM
In your csv file, line one is normally the header with column names. That one. Then the raw csv rows for the two data lines.
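For illustration only, the raw data requested would look something like this (the column names and values here are made up, not taken from Gerd's actual file):

```
Date,Payee,Amount
2025-09-01,Amazon Marketplace,-23.99
2025-09-02,Tesco,-12.50
```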
35 Posted by Gerd on 09 Sep, 2025 01:48 PM
Now I understand. You need the input file.
For the new test database I created, I only use OFX files for the import.
Even though I have the CSVs for the same periods those probably would not help you, as I don't use them for imports.
I have attached a small CSV that I had used before for import. Just in case it helps.
36 Posted by Stuart Beesley ... on 09 Sep, 2025 03:23 PM
Ok. I’m bowing out of this thread. The title says “CSV Importer tells me that duplicates were skipped”, and I have been referring to CSV files and the CSV importer, which is clearly not what you are referring to now.
37 Posted by Gerd on 09 Sep, 2025 06:02 PM
Hi Stuart,
Before you give up on me: I just created a new database and imported all the CSVs except the last month's. All those 2 1/2 years of transactions imported without an error.
But I just now imported the last month and got the error: 131 processed, 100 imported, 31 duplicates skipped. Same error as before.
I am going to send you a few records from the last CSV for your reference.
After I got the error, I selected all the records in the register that had been imported from the last CSV, deleted them, then reimported the very same file without making any changes to the CSV, and this time all records were imported correctly.
I had the duplicate errors several times after I created the new database, and I wanted to find out where they came from. I searched by the dates from the last CSV but found nothing, and looked for specific payees from the last month without finding any, so I had no idea why I had duplicates.
So, maybe with the CSV I uploaded you might get an idea why the error is triggered.
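As a side note, a minimal sketch of how an importer might flag duplicates by matching on (date, amount, description). This is not Moneydance's actual algorithm, just an illustration of why re-importing the same rows can report "skipped" even when nothing obviously matching is visible in the register:

```python
def import_rows(existing, rows):
    """Import rows, skipping any whose (date, amount, desc) key
    is already in `existing`. Returns (imported, skipped) counts."""
    imported = skipped = 0
    for row in rows:
        key = (row["date"], row["amount"], row["desc"])
        if key in existing:
            skipped += 1          # looks like a duplicate: skip it
        else:
            existing.add(key)     # new key: remember and import it
            imported += 1
    return imported, skipped

existing = set()
rows = [
    {"date": "2025-09-09", "amount": -23.99, "desc": "Amazon Marketplace"},
    {"date": "2025-09-09", "amount": -12.50, "desc": "Tesco"},
]
print(import_rows(existing, rows))  # first pass: everything imports
print(import_rows(existing, rows))  # second pass: everything is skipped
```

Under a scheme like this, two genuinely different transactions that happen to share date, amount, and payee would also be skipped as duplicates.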
38 Posted by Stuart Beesley ... on 10 Sep, 2025 09:33 PM
The only way we can troubleshoot this is if you upload the whole file, or at least examples of the raw records that fail. Your call.
39 Posted by Gerd on 14 Sep, 2025 08:40 PM
Sorry for my late reply, Stuart, and thanks for all the information and explanations.
Now that my database is nearly up to date, I won't be loading lots of transactions again. I just needed to start the new database with transactions going back to 2023.
All I have left are the few days from September. I'll check if those load without the error. If not, I will report back, and we will probably need far fewer transactions to reproduce the error. But I hope this was a one-off error, even though we don't have an explanation for why it happened.