[TASK] Detect duplicate table records within importCSVDataSet()
The testing framework provides the ability to read CSV fixture
files to import database datasets. The internal `DataSet::import()`
implementation is used by `FunctionalTestCase->importCSVDataSet()`.

Sometimes a CSV dataset file contains duplicate rows with the same
identifier (`uid` or `hash`), in which case later rows silently
override earlier ones.

This change checks for such duplicates during the dataset import
phase and throws an exception stating the table and the id value,
which makes usage easier and helps with finding test-setup issues.

Note: the `assertCSVDataSet()` counterpart does not check for duplicates.
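For illustration, a minimal fixture in the testing-framework's CSV dataset layout (the `pages` table and its columns are chosen for the example): the first row names the table, the second row names the fields, and the two data rows share `uid` 1, which the new check rejects with the exception described above.

```csv
"pages",,,
,"uid","pid","title"
,1,0,"Root page"
,1,0,"Duplicate root page"
```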

Resolves: #478
Releases: main, 7
sbuerk committed Jul 28, 2023
1 parent ea9eb46 commit a5a60c6
Showing 1 changed file with 26 additions and 4 deletions.
Classes/Core/Functional/Framework/DataHandling/DataSet.php
@@ -68,7 +68,7 @@ private function __construct(array $data)
      */
     public static function import(string $path): void
     {
-        $dataSet = self::read($path, true);
+        $dataSet = self::read($path, true, true);
         foreach ($dataSet->getTableNames() as $tableName) {
             $connection = GeneralUtility::makeInstance(ConnectionPool::class)->getConnectionForTable($tableName);
             $platform = $connection->getDatabasePlatform();
@@ -99,9 +99,9 @@ public static function import(string $path): void
     /**
      * Main entry method: Get an absolute (!) path to a .csv file, read it and return an instance of self
      */
-    public static function read(string $fileName, bool $applyDefaultValues = false): self
+    public static function read(string $fileName, bool $applyDefaultValues = false, bool $checkForDuplicates = false): self
     {
-        $data = self::parseData(self::readData($fileName));
+        $data = self::parseData(self::readData($fileName), $fileName, $checkForDuplicates);
         if ($applyDefaultValues) {
             $data = self::applyDefaultValues($data);
         }
@@ -199,7 +199,7 @@ private static function readData(string $fileName): array
      * Special value treatment:
      * + "\NULL" to treat as NULL value
      */
-    private static function parseData(array $rawData): array
+    private static function parseData(array $rawData, string $fileName, bool $checkForDuplicates): array
     {
         $data = [];
         $tableName = null;
@@ -257,8 +257,30 @@ private static function parseData(array $rawData): array
             unset($value);
             $element = array_combine($data[$tableName]['fields'], $values);
             if ($idIndex !== null) {
+                if ($checkForDuplicates && is_array($data[$tableName]['elements'][$values[$idIndex]] ?? false)) {
+                    throw new \RuntimeException(
+                        sprintf(
+                            'DataSet "%s" contains a duplicate record for idField "%s.uid" => %s',
+                            $fileName,
+                            $tableName,
+                            $values[$idIndex]
+                        ),
+                        1690538506
+                    );
+                }
                 $data[$tableName]['elements'][$values[$idIndex]] = $element;
             } elseif ($hashIndex !== null) {
+                if ($checkForDuplicates && is_array($data[$tableName]['elements'][$values[$hashIndex]] ?? false)) {
+                    throw new \RuntimeException(
+                        sprintf(
+                            'DataSet "%s" contains a duplicate record for idHash "%s.hash" => %s',
+                            $fileName,
+                            $tableName,
+                            $values[$hashIndex]
+                        ),
+                        1690541069
+                    );
+                }
                 $data[$tableName]['elements'][$values[$hashIndex]] = $element;
             } else {
                 $data[$tableName]['elements'][] = $element;
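The semantics of the duplicate check in the diff can be sketched outside PHP. The following minimal Python analogue (function and field names are invented for the example, not part of the testing framework) indexes rows by their id column and raises as soon as a key repeats, mirroring the behavior added to `parseData()`:

```python
def index_by_id(rows, id_field="uid"):
    """Index fixture rows by their id column, raising on duplicates
    (mirrors the duplicate check added in this commit)."""
    elements = {}
    for row in rows:
        key = row[id_field]
        if key in elements:
            # Same failure mode as the new RuntimeException in DataSet::parseData()
            raise RuntimeError(
                f'DataSet contains a duplicate record for idField "{id_field}" => {key}'
            )
        elements[key] = row
    return elements

rows = [{"uid": 1, "title": "a"}, {"uid": 2, "title": "b"}]
print(len(index_by_id(rows)))  # 2
```

As in the PHP change, duplicate detection is a side effect of building the keyed `elements` map that the importer needs anyway, so the check adds no extra pass over the data.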
