Import management
The file import allows you to import a list from your computer or from your Mindbaz SFTP. If you do not know your SFTP credentials, you can request them from customer service.
This Help Center article gives you more information on the formats and encodings used by the file import.
Schedule an import
To import a subscriber file, you can use the POST api/{idsite}/imports/scheduleNow call.
The parameters to specify in the url are as follows:
Name | Description | Type | Additional information |
---|---|---|---|
idsite | MindBaz site identifier | Integer | Required |
In the body of the request, the parameters to pass (as "application/json" or "text/json") are the following:
Name | Description | Type | Additional information |
---|---|---|---|
parameters | Basic parameters for creating an import | ImportParameters | Mandatory |
The ImportParameters type is a structure containing the parameters for creating a subscriber import:
Name | Description | Type | Additional information |
---|---|---|---|
filename | Name of the file to import. The file must be present on the SFTP of your database (request access from Mindbaz support). 500 characters max. Note: you can use a subdirectory, but this subdirectory must also exist in the "errors" directory. | String | Mandatory |
startLine | Number of the first line to import (1 by default) | Integer | Mandatory |
columnNamesFirstLine | Are the column headers on the first line? | Boolean | None |
textQualifier | Text qualifier (e.g. "). 1 character max | String | Required |
columnDelimiter | Column separator (e.g. ;). 1 character max | String | Required |
rowDelimiter | Row separator (e.g. \r\n). 10 characters max | String | Mandatory |
codepage | File codepage (e.g. 1252 or 65001). 1252 by default | Integer | None |
sheetName | Name of the Excel sheet to import. 255 characters max | String | Mandatory |
keyDbFieldName | Name of the database field used as the deduplication key. 255 characters max. For example: fld1 to use the email as key (default value), fld0 for the subscriber id, fld43 for the customer id, md5_fld1 for fld1 encoded in MD5 (to be activated by Mindbaz support beforehand) | String | Mandatory |
keepFld7 | Do not change the subscriber state? true by default | Boolean | None |
insertEnabled | Allow new subscribers to be inserted? true by default | Boolean | None |
safeMode | Safe mode. Enabled: empty or NULL values replace the fields in error and the row is still imported. Disabled (default): the whole row is rejected if at least one of its fields is in error. | Boolean | None |
emailNotification | List of emails (separated by ;) that will receive the import report at the end. 500 characters max | String | Mandatory |
deleteFile | Delete the file from the SFTP at the end of the import? false by default | Boolean | None |
columnMapping | Mapping of columns and fixed values to the database fields. If a column from the file is not imported, it must not be included in this array. | Collection of ImportMapping | Mandatory |
The ImportMapping type records the mapping of a column from the file to a field in the database. It can also record a fixed value assigned to a database field (this fixed value is then used for all rows of the file). This class is used in the ImportParameters class, in an array containing the mapping of all the columns of the file to be imported.
Name | Description | Type | Additional information |
---|---|---|---|
dbFieldName | Name of the field in the database. 15 characters max. E.g. fld1 for the email, fld2 for the date of first registration, etc. | String | Mandatory |
idMappingMode | Id of the mapping mode: 1 (default): overwrite (the file data overwrites the database data, except if the file data is null); 2: keep (the file data is written only if the database data is null) | Integer | Required |
fileColumnName | Name of the file column mapped to dbFieldName. Cannot be used together with the fixedValue field. 255 characters max. If the file has no header, use col0, col1, etc. as column names | String | Mandatory if fixedValue is null |
fixedValue | Fixed value mapped to dbFieldName. For a date, use the "dd/MM/yyyy" format. Cannot be used together with the fileColumnName field. 1000 characters max | String | Mandatory if fileColumnName is null |
format | Format (essentially a date format) of the file column. To be filled in only for a mapping on a column (fileColumnName) of type DATE or LIST. Examples of DATE formats: "dd/MM/yyyy HH:mm:ss", "yyyy-MM-dd", "yyyy-MM-dd HH:mm:ss", ... For a LIST type, choose between the two formats "ID" and "VALUE": ID means the file data for this column corresponds to the ids of the list (e.g. "0", "1" for the list "0:No,1:Yes"); VALUE means the file data corresponds to the values of the list (e.g. "No", "Yes" for the same list). For the other types, this field must be null. 30 characters max (see the sketch after this table) | String | None |
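To complement the example below, here is a minimal, hypothetical sketch (in Python, as dictionaries ready to be serialized to JSON) of two other kinds of ImportMapping entries: one using fixedValue and one mapping a LIST column by VALUE. The field ids (fld10, fld12) and the OPTIN column are placeholders, not real fields of your database:
# Hypothetical sketch of two additional ImportMapping entries (placeholder field ids).
column_mapping = [
    {
        # Fixed value applied to every row of the file; dates use the "dd/MM/yyyy" format
        "dbFieldName": "fld10",        # placeholder field id
        "idMappingMode": 1,            # 1 = overwrite
        "fileColumnName": None,        # must be null when fixedValue is used
        "fixedValue": "21/06/2022",
        "format": None                 # format only applies to DATE or LIST file columns
    },
    {
        # LIST-type field fed with the list values ("No"/"Yes") rather than the ids
        "dbFieldName": "fld12",        # placeholder field id
        "idMappingMode": 2,            # 2 = keep: written only if the database value is null
        "fileColumnName": "OPTIN",     # placeholder column name from the file
        "fixedValue": None,
        "format": "VALUE"
    }
]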
Here is an example of importing the file "ExportMindBaz_21-06-2022.csv" with a mapping on 2 columns:
{
"parameters": {
"filename": "ExportMindBaz_21-06-2022.csv",
"startLine": 1,
"columnNamesFirstLine": true,
"textQualifier": null,
"columnDelimiter": "|",
"rowDelimiter": "\r\n",
"codepage": 0,
"sheetName": null,
"keyDbFieldName": "fld1",
"keepFld7": false,
"insertEnabled": true,
"safeMode": false,
"emailNotification": "serviceclient@mindbaz.com;mindbaz@mail.com",
"deleteFile": true,
"columnMapping": [
{
"dbFieldName": "fld158",
"idMappingMode": 1,
"fileColumnName": "ANIMAL_DATE_NAISSANCE",
"fixedValue": null,
"format": "dd/MM/yyyy HH:mm:ss"
},
{
"dbFieldName": "fld157",
"idMappingMode": 1,
"fileColumnName": "ANIMAL_NOM",
"fixedValue": null,
"format": null
}
]
}
}
Code examples
- C#
- Java
- Python
// C# (RestSharp)
var client = new RestClient("https://api.mindbaz.com/api/{idSite}/imports/scheduleNow");
client.Timeout = -1;
var request = new RestRequest(Method.POST);
request.AddHeader("Authorization", "Bearer `TOKEN`");
request.AddHeader("Content-Type", "application/json");
var body = @"{
" + "\n" +
@" ""parameters"": {
" + "\n" +
@" ""filename"": ""ExportMindBaz_21-06-2022.csv"",
" + "\n" +
@" ""startLine"": 1,
" + "\n" +
@" ""columnNamesFirstLine"": true,
" + "\n" +
@" ""textQualifier"": null,
" + "\n" +
@" ""columnDelimiter"": ""|"",
" + "\n" +
@" ""rowDelimiter"": ""\r\n"",
" + "\n" +
@" ""codepage"": 0,
" + "\n" +
@" ""sheetName"": null,
" + "\n" +
@" ""keyDbFieldName"": ""fld1"",
" + "\n" +
@" ""keepFld7"": false,
" + "\n" +
@" ""insertEnabled"": true,
" + "\n" +
@" ""safeMode"": false,
" + "\n" +
@" ""emailNotification"": ""mail1@mindbaz.com;mail2@mindbaz.com"",
" + "\n" +
@" ""deleteFile"": true,
" + "\n" +
@" ""columnMapping"": [
" + "\n" +
@" {
" + "\n" +
@" ""dbFieldName"": ""fld158"",
" + "\n" +
@" ""idMappingMode"": 1,
" + "\n" +
@" ""fileColumnName"": ""ANIMAL_DATE_NAISSANCE"",
" + "\n" +
@" ""fixedValue"": null,
" + "\n" +
@" ""format"": ""dd/MM/yyyy HH:mm:ss""
" + "\n" +
@" },
" + "\n" +
@" {
" + "\n" +
@" ""dbFieldName"": ""fld157"",
" + "\n" +
@" ""idMappingMode"": 1,
" + "\n" +
@" ""fileColumnName"": ""ANIMAL_NOM"",
" + "\n" +
@" ""fixedValue"": null,
" + "\n" +
@" ""format"": null
" + "\n" +
@" }
" + "\n" +
@" ]
" + "\n" +
@" }
" + "\n" +
@"}";
request.AddParameter("application/json", body, ParameterType.RequestBody);
IRestResponse response = client.Execute(request);
Console.WriteLine(response.Content);
// Java (Unirest)
Unirest.setTimeouts(0, 0);
HttpResponse<String> response = Unirest.post("https://api.mindbaz.com/api/{idSite}/imports/scheduleNow")
.header("Authorization", "Bearer `TOKEN`")
.header("Content-Type", "application/json")
.body("{\r\n \"parameters\": {\r\n \"filename\": \"ExportMindBaz_21-06-2022.csv\",\r\n \"startLine\": 1,\r\n \"columnNamesFirstLine\": true,\r\n \"textQualifier\": null,\r\n \"columnDelimiter\": \"|\",\r\n \"rowDelimiter\": \"\\r\\n\",\r\n \"codepage\": 0,\r\n \"sheetName\": null,\r\n \"keyDbFieldName\": \"fld1\",\r\n \"keepFld7\": false,\r\n \"insertEnabled\": true,\r\n \"safeMode\": false,\r\n \"emailNotification\": \"mail1@mindbaz.com;mail2@mindbaz.com\",\r\n \"deleteFile\": true,\r\n \"columnMapping\": [\r\n {\r\n \"dbFieldName\": \"fld158\",\r\n \"idMappingMode\": 1,\r\n \"fileColumnName\": \"ANIMAL_DATE_NAISSANCE\",\r\n \"fixedValue\": null,\r\n \"format\": \"dd/MM/yyyy HH:mm:ss\"\r\n },\r\n {\r\n \"dbFieldName\": \"fld157\",\r\n \"idMappingMode\": 1,\r\n \"fileColumnName\": \"ANIMAL_NOM\",\r\n \"fixedValue\": null,\r\n \"format\": null\r\n }\r\n ]\r\n }\r\n}")
.asString();
# Python (http.client, standard library)
import http.client
import json
conn = http.client.HTTPSConnection("api.mindbaz.com")
payload = json.dumps({
"parameters": {
"filename": "ExportMindBaz_21-06-2022.csv",
"startLine": 1,
"columnNamesFirstLine": True,
"textQualifier": None,
"columnDelimiter": "|",
"rowDelimiter": "\r\n",
"codepage": 0,
"sheetName": None,
"keyDbFieldName": "fld1",
"keepFld7": False,
"insertEnabled": True,
"safeMode": False,
"emailNotification": "mail1@mindbaz.com;mail2@mindbaz.com",
"deleteFile": True,
"columnMapping": [
{
"dbFieldName": "fld158",
"idMappingMode": 1,
"fileColumnName": "ANIMAL_DATE_NAISSANCE",
"fixedValue": None,
"format": "dd/MM/yyyy HH:mm:ss"
},
{
"dbFieldName": "fld157",
"idMappingMode": 1,
"fileColumnName": "ANIMAL_NOM",
"fixedValue": None,
"format": None
}
]
}
})
headers = {
'Authorization': 'Bearer `TOKEN`',
'Content-Type': 'application/json'
}
conn.request("POST", "/api/100/imports/scheduleNow", payload, headers)
res = conn.getresponse()
data = res.read()
print(data.decode("utf-8"))
The file must exist on the SFTP at the time the import is created. Otherwise you will get a 400 error with "Error, file not found".
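If the file is not yet on your SFTP, you can upload it beforehand. The sketch below uses the third-party paramiko library; the host, credentials and paths are placeholders to replace with the values provided by Mindbaz customer service:
# Hypothetical sketch: upload the file to the Mindbaz SFTP before scheduling the import.
# Host, credentials and paths below are placeholders.
import paramiko

transport = paramiko.Transport(("your-mindbaz-sftp-host", 22))
transport.connect(username="your_sftp_login", password="your_sftp_password")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("ExportMindBaz_21-06-2022.csv", "ExportMindBaz_21-06-2022.csv")  # local path, remote path
sftp.close()
transport.close()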
The Import object
In response to the call, you get an object of type Import, which contains the information of the newly created import. Make sure to keep the id of the created import: you will need it to follow its progress (see the sketch after the response example below).
{
  "success": true,
  "data": {
    "id": 252,
    "parameters": {
      "filename": "ExportMindBaz_21-06-2022.csv",
      "startLine": 1,
      "columnNamesFirstLine": true,
      "textQualifier": null,
      "columnDelimiter": "|",
      "rowDelimiter": "\r\n",
      "codepage": 0,
      "sheetName": null,
      "keyDbFieldName": "fld1",
      "keepFld7": false,
      "insertEnabled": true,
      "safeMode": false,
      "emailNotification": "serviceclient@mindbaz.com;mindbaz@mail.com",
      "deleteFile": true,
      "columnMapping": [
        {
          "dbFieldName": "fld157",
          "idMappingMode": 1,
          "fileColumnName": "animal_nom",
          "fixedValue": "",
          "format": ""
        },
        {
          "dbFieldName": "fld158",
          "idMappingMode": 1,
          "fileColumnName": "animal_date_naissance",
          "fixedValue": "",
          "format": "dd/MM/yyyy HH:mm:ss"
        }
      ]
    },
    "creationDate": "2023-01-05T10:35:30.9983907+01:00",
    "importState": 1,
    "isDeleted": false,
    "deletionDate": null,
    "schedulingInfos": {
      "isRecurring": false,
      "recurringInfos": null,
      "scheduledDate": "2023-01-05T10:37:30.8108812+01:00"
    },
    "lastImportLog": null
  },
  "error": null,
  "typeName": "Import"
}
We find the input parameters and the mapping. In addition, we get the scheduling information:
- the schedule in schedulingInfos,
- the status of the import in importState (1 = scheduled)
- and the log of the last import in lastImportLog (currently null because the import has not yet started).
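As a minimal sketch, assuming the response body was read as raw bytes (as in the Python example above), the id and state of the created import can be extracted like this; the helper name is hypothetical:
import json

def extract_import_id(raw_body: bytes):
    """Return (id, importState) from a scheduleNow response, or raise if success is false."""
    response = json.loads(raw_body.decode("utf-8"))
    if not response.get("success"):
        raise RuntimeError("Import creation failed: {}".format(response.get("error")))
    return response["data"]["id"], response["data"]["importState"]

# Example, reusing the 'data' bytes returned by res.read() in the Python call above:
# id_import, state = extract_import_id(data)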
The type ImportSchedulingInfos represents the scheduling information of an import job:
Name | Description | Type | Additional information |
---|---|---|---|
scheduledDate | Scheduled execution date and time | Date | Mandatory |
isRecurring | Indicates whether this is a recurring schedule | Boolean | None |
recurringInfos | Recurring schedule information | RecurringInfos | None |
The RecurringInfos type represents the recurring information of a job:
Name | Description | Type | Additional Information |
---|---|---|---|
filenameTemplate | Template of the file name | String | None |
frequencyUnits | Frequency unit qualifying the frequency property | EFrequencyUnit | None |
frequency | Frequency of execution (e.g. frequency=5 and frequencyUnits=Day means "every 5 days") | Integer | None |
monthDay | Scheduled day of the month (e.g. monthDay=5 means "the 5th of every month") | Unsigned byte | None |
monthDay2 | Scheduled day of the month (e.g. first Monday, last working day...) | String | None |
weekDays | Scheduled day(s) of the week | Collection of unsigned bytes | None |
time | Scheduled time of day | String | None |
startDate | Start date | Date | None |
endDate | End date | Date | None |
The EFrequencyUnit type is of type Enum:
Name | Value | Description |
---|---|---|
Day | 1 | The frequency unit is in days |
Week | 2 | The frequency unit is in weeks |
Monthly | 3 | The frequency unit is in months |
Follow the status of your import
If you want to know the status of your import, you can use the GET /api/{idsite}/imports/{idImport} call with the id of the import you just created. The response contains the same Import object (a minimal polling sketch is given after the status table below).
The parameter we are interested in is the importState field (EImportState). The EImportState type is of type Enum:
Name | Value | Description |
---|---|---|
Scheduled | 1 | The import is scheduled |
InProgress | 2 | Import is in progress |
Completed | 3 | Import is complete |
Paused | 4 | Import is paused |
PackageErrorCreation | 5 | Error while creating the package |
If your import was misconfigured or if the mapping was done incorrectly, you will get a status=5 (PackageErrorCreation).
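As an illustration, here is a minimal polling sketch using Python's http.client, in the spirit of the earlier example. The site id (100), import id (252), bearer token and polling interval are placeholders to adapt to your own case:
import http.client
import json
import time

ID_SITE = 100      # placeholder site id
ID_IMPORT = 252    # id returned when the import was scheduled
HEADERS = {"Authorization": "Bearer `TOKEN`"}

def get_import_state(id_site, id_import):
    """Fetch the Import object and return its 'data' part."""
    conn = http.client.HTTPSConnection("api.mindbaz.com")
    conn.request("GET", "/api/{}/imports/{}".format(id_site, id_import), headers=HEADERS)
    body = json.loads(conn.getresponse().read().decode("utf-8"))
    conn.close()
    return body["data"]

# Poll until the import leaves the Scheduled (1) / InProgress (2) states
data = get_import_state(ID_SITE, ID_IMPORT)
while data["importState"] in (1, 2):
    time.sleep(60)  # arbitrary polling interval
    data = get_import_state(ID_SITE, ID_IMPORT)
print("Final state:", data["importState"])  # 3 = Completed, 5 = PackageErrorCreation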
When the import is finished, you will get its execution log in the lastImportLog field (ImportLog). The ImportLog type represents the result information of an import execution (a short reading sketch follows the table below):
Name | Description | Type | Additional information |
---|---|---|---|
id | Log ID | Integer | None |
filename | Name of the imported file | String | None |
startDate | Start date of the import | Date | None |
endDate | End date of the import | Date | None |
errorMessage | Message in case of error | String | None |
totalRows | Total number of lines in the file | Integer | None |
totalDuplicateKeys | Number of duplicate lines according to the selected key | Integer | None |
nbValidations | Number of rows validated after validation | Integer | None |
nbErrorRows | Number of rows in error (invalid values, emails, date format, string length...) | Integer | None |
nbBadFormattedRows | Number of rows with column format problems (too many columns, badly quoted column, e.g. "value; "value";) | Integer | None |
badFormattedRowNumbers | Numbers of the badly formatted rows in the file (1000 characters max) | String | None |
totalUpdates | Total number of rows to update | Integer | None |
nbUpdates | Total number of rows updated | Integer | None |
totalInserts | Total number of rows to insert | Integer | None |
nbInserts | Total number of rows inserted | Integer | None |
nbForbiddenInserts | Number of forbidden inserts (if the option forbidding inserts is active, or if there are duplicates in the file) | Integer | None |
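As a follow-up to the polling sketch above, here is a minimal, hypothetical way to read the main counters of lastImportLog once the import has finished (the get_import_state helper and the placeholder ids come from that sketch):
# Hypothetical sketch, reusing get_import_state from the polling example above.
data = get_import_state(ID_SITE, ID_IMPORT)
log = data.get("lastImportLog")
if data["importState"] == 3 and log is not None:
    print("File:", log["filename"])
    print("Rows in file:", log["totalRows"], "- duplicates on key:", log["totalDuplicateKeys"])
    print("Inserted:", log["nbInserts"], "of", log["totalInserts"])
    print("Updated:", log["nbUpdates"], "of", log["totalUpdates"])
    print("Rows in error:", log["nbErrorRows"], "- badly formatted:", log["nbBadFormattedRows"])
elif data["importState"] == 5 and log is not None:
    print("Package creation error:", log["errorMessage"])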