
Reading Performance

Reading time series data

Zarr stores can be read in various ways. Here, however, we focus on using the Xarray library to read and load in memory large time series over a single geographic location, as per the needs of PVGIS.

How does PVGIS 6 read external time series?


The following are various measurements of the time it takes to read and load large time series in memory, replicating the essential data-retrieval step in PVGIS.

Performance measurements

A single location

⏱ The following table presents the speed of reading complete time series over a single geographic location using ...

| Chunk Sizes | Compression | Size | Compression ratio | Timing 1 (ms ± SD)[^1] | Timing 2 (ms ± SD) | Timing 3 (ms ± SD) | Throughput (MB/s) |
|---|---|---|---|---|---|---|---|
| 350016 x 2 x 2 | | 73G | | 52.6 ± 3.09 | 53.6 ± 1.11 | 51.4 ± 3.67 | |
| 350016 x 4 x 4 | | 73G | | 54.4 ± 1.06 | 50.9 ± 2.5 | 49.5 ± 3.63 | |
| 350016 x 8 x 8 | | 73G | | 53.8 ± 1.25 | 53 ± 2.34 | 49.4 ± 4.4 | |
| 350016 x 16 x 16 | | 76G | | 47.3 ± 2.19 | 49.3 ± 1.53 | 45.2 ± 3.54 | |
| 350016 x 32 x 32 | | 86G | | 53 ± 2.16 | 53.1 ± 2.51 | 49.5 ± 3.34 | |
| 350016 x 64 x 64 | | 86G | | 52.4 ± 3.07 | 48.7 ± 7.43 | 50.1 ± 3.18 | |
| 350016 x 128 x 128 | | 86G | | 49.4 ± 1.73 | 49.4 ± 1.26 | 47.7 ± 2.04 | |
| 350016 x 256 x 256 | | | | | | | |
| 350016 x 325 x 325 | | | | | | | |
| 350016 x 512 x 512 | | | | | | | |
| 350016 x 650 x 650 | | | | | | | |
| 350016 x 2 x 2 | zstd 1 | 16G | 4.6 | 66.6 ± 2.33 | 67.3 ± 1.8 | 64.8 ± 2.83 | |
| 350016 x 4 x 4 | zstd 1 | 15G | 4.9 | 66.2 ± 3.2 | 66.7 ± 2.47 | 64.1 ± 2.82 | |
| 350016 x 8 x 8 | zstd 1 | 14G | 5.2 | 67.3 ± 1.78 | 67.6 ± 1.5 | 63.7 ± 2.49 | |
| 350016 x 16 x 16 | zstd 1 | 13G | 5.8 | 66.2 ± 2.62 | 66.6 ± 1.86 | 65.3 ± 1.63 | |
| 350016 x 32 x 32 | zstd 1 | 13G | 6.6 | 67.4 ± 1.32 | 66.8 ± 1.73 | 64.4 ± 3.33 | |
| 350016 x 64 x 64 | zstd 1 | | | | | | |
| 350016 x 128 x 128 | zstd 1 | | | | | | |
| 350016 x 256 x 256 | zstd 1 | | | | | | |
| 350016 x 325 x 325 | zstd 1 | | | | | | |
| 350016 x 512 x 512 | zstd 1 | | | | | | |
| 350016 x 650 x 650 | zstd 1 | | | | | | |
| 10938 x 15 x 15 | zstd 1 | 13G | | 67.7 ± 1.56 | 67.6 ± 1.95 | 65.1 ± 3.28 | |
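The empty throughput column can be estimated from the figures above. As a rough worked example (assuming the variable is stored as float32, which is an assumption, and ignoring the extra series decoded from the same chunk):

```python
# Hypothetical back-of-the-envelope throughput estimate, assuming float32 data
timesteps = 350016          # length of the time dimension
bytes_per_value = 4         # float32
seconds = 0.0526            # ~52.6 ms, first timing of the 350016 x 2 x 2 chunking

series_bytes = timesteps * bytes_per_value      # 1,400,064 bytes ≈ 1.34 MiB
throughput_mb_s = series_bytes / seconds / 1e6  # decimal MB/s
print(f"{throughput_mb_s:.1f} MB/s")            # roughly 26-27 MB/s per series
```

Note that with a 2 x 2 spatial chunk each read actually decodes four adjacent series, so the effective per-chunk throughput is about four times this per-series figure.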

Alternative tabular presentation

| Shape | Chunk Sizes | Compression | Size | Compression ratio | Timing 1 (ms ± SD)[^1] | Timing 2 (ms ± SD) | Timing 3 (ms ± SD) | Throughput (MB/s) |
|---|---|---|---|---|---|---|---|---|
| 1 x 2 x 2 | 350016, 2, 2 | | 73G | | 52.6 ± 3.09 | 53.6 ± 1.11 | 51.4 ± 3.67 | |
| 1 x 4 x 4 | 350016, 4, 4 | | 73G | | 54.4 ± 1.06 | 50.9 ± 2.5 | 49.5 ± 3.63 | |
| 1 x 8 x 8 | 350016, 8, 8 | | 73G | | 53.8 ± 1.25 | 53 ± 2.34 | 49.4 ± 4.4 | |
| 1 x 16 x 16 | 350016, 16, 16 | | 76G | | 47.3 ± 2.19 | 49.3 ± 1.53 | 45.2 ± 3.54 | |
| 1 x 32 x 32 | 350016, 32, 32 | | 86G | | 53 ± 2.16 | 53.1 ± 2.51 | 49.5 ± 3.34 | |
| 1 x 64 x 64 | 350016, 64, 64 | | 86G | | 52.4 ± 3.07 | 48.7 ± 7.43 | 50.1 ± 3.18 | |
| 1 x 128 x 128 | 350016, 128, 128 | | 86G | | 49.4 ± 1.73 | 49.4 ± 1.26 | 47.7 ± 2.04 | |
| 1 x 256 x 256 | | | | | | | | |
| 1 x 325 x 325 | 350016, 325, 325 | | | | | | | |
| 1 x 512 x 512 | 350016, 512, 512 | | | | | | | |
| 1 x 4 x 4 | 350016, 4, 4 | zstd 1 | 15G | 4.9 | 66.2 ± 3.2 | 66.7 ± 2.47 | 64.1 ± 2.82 | |
| 1 x 8 x 8 | 350016, 8, 8 | zstd 1 | 14G | 5.2 | 67.3 ± 1.78 | 67.6 ± 1.5 | 63.7 ± 2.49 | |
| 1 x 16 x 16 | 350016, 16, 16 | zstd 1 | 13G | 5.8 | 66.2 ± 2.62 | 66.6 ± 1.86 | 65.3 ± 1.63 | |
| 1 x 32 x 32 | 350016, 32, 32 | zstd 1 | 13G | 6.6 | 67.4 ± 1.32 | 66.8 ± 1.73 | 64.4 ± 3.33 | |
| 1 x 64 x 64 | 350016, 64, 64 | zstd 1 | | | | | | |
| 1 x 128 x 128 | 350016, 128, 128 | zstd 1 | | | | | | |
| 1 x 256 x 256 | | zstd 1 | | | | | | |
| 1 x 325 x 325 | 350016, 325, 325 | zstd 1 | | | | | | |
| 1 x 512 x 512 | 350016, 512, 512 | zstd 1 | | | | | | |
| 1 x 650 x 650 | 350016, 650, 650 | zstd 1 | | | | | | |
| 10938 x 15 x 15 | 10938, 15, 15 | zstd 1 | 13G | | 67.7 ± 1.56 | 67.6 ± 1.95 | 65.1 ± 3.28 | |

Poor performance

Sometimes a coincidence of unfavorable factors can lead to poor performance. This can be anything from a saturated number of processing requests to inefficient queuing of processes by the operating system's scheduler, and more.

The following figures are one such bad example: measurements of the time to read from Zarr stores with the same chunk-size configurations listed above.
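One way to keep such system noise from dominating a measurement (a sketch, not the exact procedure used for these tables) is to take several independent sets of timings and report a robust statistic such as the median, e.g. with `timeit.repeat`:

```python
import statistics
import timeit

def read_series():
    # Placeholder for the actual read, e.g.
    # dataset.SIS.sel(lon=longitude, lat=latitude, method="nearest").load()
    sum(range(10_000))

# Five independent sets of 10 runs each; a single noisy set is inflated
# by scheduler hiccups, so the median (or minimum) across sets is more robust.
runs = timeit.repeat(read_series, repeat=5, number=10)
print(f"median of 5 sets of 10 runs: {statistics.median(runs):.4f} s")
```

The minimum is the conventional choice for pure CPU work; for I/O-bound reads the median is often more representative of real-world latency.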

| Chunk Sizes | Compression | Size | Compression ratio | mean ± SD[^1] | loops | Timing (μs ± SD)[^2] |
|---|---|---|---|---|---|---|
| 350016, 2, 2 | | 73G | | 117 ms ± 9.53 | 10 | 811 ± 678 ns |
| 350016, 4, 4 | | 73G | | 121 ms ± 3.57 | 10 | 780 ± 576 ns |
| 350016, 8, 8 | | 73G | | 144 ms ± 6.72 | 1 | 830 ± 776 ns |
| 350016, 16, 16 | | 76G | | 239 ms ± 7.39 | 1 | 813 ± 609 ns |
| 350016, 32, 32 | | 86G | | 678 ms ± 8.03 | 1 | 815 ± 751 ns |
| 350016, 64, 64 | | 86G | | 2.6 s ± 10.2 | 1 | 817 ± 6.35 μs |
| 350016, 128, 128 | | 86G | | 9.24 s ± 33.3 | 1 | 810 ± 4.03 μs |
| 350016, 256, 256 | | | | | | |
| 350016, 325, 325 | | | | | | |
| 350016, 512, 512 | | | | | | |
| 350016, 650, 650 | | | | | | |
| 350016, 2, 2 | zstd 1 | 16G | 4.6 | 153 ms ± 5.66 | 10 | 816 ± 680 ns |
| 350016, 4, 4 | zstd 1 | 15G | 4.9 | 178 ms ± 5.02 | 10 | 799 ± 16.4 μs |
| 350016, 8, 8 | zstd 1 | 14G | 5.2 | 285 ms ± 9.98 | 1 | 781 ± 615 ns |
| 350016, 16, 16 | zstd 1 | 13G | 5.8 | 773 ms ± 10.5 | 1 | 781 ± 627 ns |
| 350016, 32, 32 | zstd 1 | 13G | 6.6 | 2.57 s ± 16.1 | 1 | 782 ± 768 ns |
| 350016, 64, 64 | zstd 1 | | | | | |
| 350016, 128, 128 | zstd 1 | | | | | |
| 350016, 256, 256 | zstd 1 | | | | | |
| 350016, 325, 325 | zstd 1 | | | | | |
| 350016, 512, 512 | zstd 1 | | | | | |
| 350016, 650, 650 | zstd 1 | | | | | |
| 10938, 15, 15 | zstd 1 | 13G | | 196 ms ± 13.3 | 10 | 951 ± 573 ns |

Multiple locations

The following experiments, each repeated twice, assess reading performance over multiple geographic locations, one after another rather than in parallel. They support the claim that one of the most important factors affecting reading performance is the structural configuration of the time series data itself, independent of the geographic location of interest (latitude, longitude).
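The claim is easy to motivate: for a point selection, the chunk to fetch along each axis is found by integer division of the coordinate's array index by the chunk size, so every location costs one chunk read per axis regardless of where it sits on the grid. A sketch of that mapping (the index values are illustrative):

```python
def chunk_of(index: int, chunk_size: int) -> int:
    # A point at array position `index` lives in this chunk along the axis
    return index // chunk_size

# The same location (say lat index 137, lon index 42) under two chunkings:
for chunk_size in (2, 128):
    lat_chunk = chunk_of(137, chunk_size)
    lon_chunk = chunk_of(42, chunk_size)
    print(f"chunk size {chunk_size}: lat chunk {lat_chunk}, lon chunk {lon_chunk}")

# Either way exactly one chunk per axis is touched; what differs between
# configurations is how much data that chunk carries (2 x 2 vs 128 x 128
# series per chunk), not anything specific to the location itself.
```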

Metadata of the experiment

  • Performed on 1 February 2025
  • Data retrieved from Zarr stores
  • Stored on an SSD reading at 560 MB/s
  • XFS file system with a block size of 4096 bytes
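The block size reported above can be confirmed programmatically (a sketch; querying the current directory is an assumption, substitute the actual mount point):

```python
import os

# f_bsize is the filesystem's preferred I/O block size in bytes;
# on the XFS volume described above this should report 4096.
stats = os.statvfs(".")
print(f"block size: {stats.f_bsize} bytes")
```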

Scripted operations using timeit

import timeit
from pathlib import Path

import numpy as np
import xarray as xr

# longitude_values and latitude_values hold the coordinates of the
# 23 test locations listed at the end of this page
with open('timing_read_operations_from_zarr_stores_again.txt', 'w') as f:
    for file in Path('.').glob('*.zarr'):
        dataset = xr.open_zarr(file)
        chunks = dataset.SIS.encoding['chunks']
        print(f"Dataset : {file},  Chunks : {chunks}", file=f)

        for longitude, latitude in zip(longitude_values, latitude_values):
            print(f' - {longitude=}, {latitude=}', file=f)

            # Measure the execution time of the .sel() + .load() over 100 runs
            times = []
            for _ in range(100):
                start_time = timeit.default_timer()
                dataset.SIS.sel(lon=longitude, lat=latitude, method='nearest').load()
                times.append(timeit.default_timer() - start_time)

            median_time = np.median(times)  # Median of the recorded times
            print(f'Median time for {longitude=}, {latitude=}: {median_time:.4f} seconds for 100 runs', file=f)

        print(file=f)  # Blank line between datasets

Uncompressed data

Median reading time of location time series from uncompressed data

Experiment 1

Location[^16]  2 x 2[^3]  4 x 4[^4]  8 x 8[^5]  16 x 16[^6]  32 x 32[^7]  64 x 64[^8]  128 x 128[^9]
1 0.0197 0.0272 0.0755 0.1652 0.6059 2.3685 9.0944
2 0.0203 0.0275 0.0745 0.1651 0.6052 2.3602 8.4971
3 0.0205 0.0275 0.0748 0.1648 0.6056 2.3624 9.0870
4 0.0208 0.0274 0.0749 0.1651 0.6051 2.3623 9.0829
5 0.0207 0.0275 0.0745 0.1651 0.6056 2.3635 9.0871
6 0.0207 0.0275 0.0746 0.1652 0.6060 2.3604 8.4941
7 0.0203 0.0274 0.0745 0.1622 0.6054 2.3615 8.5074
8 0.0206 0.0275 0.0746 0.1553 0.6054 2.3619 8.4934
9 0.0204 0.0274 0.0749 0.1523 0.6058 2.3605 8.4927
10 0.0205 0.0274 0.0748 0.1500 0.6060 2.3596 9.0878
11 0.0205 0.0274 0.0747 0.1492 0.6052 2.3617 9.0769
12 0.0207 0.0274 0.0745 0.1488 0.6051 2.3617 8.3539
13 0.0206 0.0274 0.0746 0.1486 0.6053 2.3614 8.3724
14 0.0204 0.0274 0.0745 0.1485 0.6050 2.3628 8.4918
15 0.0204 0.0273 0.0746 0.1484 0.6057 2.3626 9.0837
16 0.0203 0.0275 0.0743 0.1651 0.6049 2.3621 9.0827
17 0.0203 0.0273 0.0746 0.1484 0.6054 2.3599 9.0760
18 0.0203 0.0274 0.0747 0.1488 0.6050 2.3595 8.3723
19 0.0203 0.0274 0.0748 0.1484 0.6052 2.3599 8.4934
20 0.0206 0.0274 0.0746 0.1484 0.6051 2.3623 9.0840
21 0.0202 0.0274 0.0746 0.1485 0.6046 2.3608 8.3722
22 0.0206 0.0275 0.0746 0.1483 0.5587 2.3617 8.3544
23 0.0204 0.0274 0.0746 0.1483 0.5343 2.3644 8.4756

Experiment 2

Location[^16]  2 x 2[^3]  4 x 4[^4]  8 x 8[^5]  16 x 16[^6]  32 x 32[^7]  64 x 64[^8]  128 x 128[^9]
1 0.0190 0.0286 0.0755 0.1819 0.6064 2.3736 9.0857
2 0.0203 0.0288 0.0728 0.1818 0.6057 2.3637 8.4932
3 0.0203 0.0288 0.0731 0.1671 0.6055 2.3655 9.0849
4 0.0203 0.0288 0.0740 0.1665 0.6057 2.3657 9.0817
5 0.0203 0.0289 0.0735 0.1666 0.6059 2.3657 9.0812
6 0.0203 0.0289 0.0740 0.1662 0.6052 2.3666 8.4924
7 0.0204 0.0288 0.0739 0.1634 0.6055 2.3651 8.4958
8 0.0202 0.0288 0.0739 0.1563 0.6063 2.3630 8.4896
9 0.0202 0.0288 0.0733 0.1535 0.6059 2.3625 8.4916
10 0.0204 0.0289 0.0743 0.1513 0.6062 2.3612 9.0799
11 0.0206 0.0289 0.0736 0.1506 0.6055 2.3643 9.0824
12 0.0204 0.0289 0.0733 0.1498 0.6057 2.3648 8.3711
13 0.0205 0.0289 0.0747 0.1500 0.6058 2.3629 8.3736
14 0.0203 0.0288 0.0746 0.1500 0.6055 2.3609 8.5028
15 0.0205 0.0288 0.0744 0.1488 0.6061 2.3618 9.0809
16 0.0205 0.0289 0.0744 0.1655 0.6058 2.3638 9.0836
17 0.0196 0.0288 0.0744 0.1490 0.6055 2.3610 9.0770
18 0.0205 0.0288 0.0734 0.1494 0.6049 2.3637 8.3510
19 0.0202 0.0289 0.0746 0.1490 0.6051 2.3611 8.4743
20 0.0198 0.0289 0.0742 0.1496 0.6053 2.3628 9.0673
21 0.0202 0.0288 0.0737 0.1497 0.6055 2.3588 8.3543
22 0.0199 0.0289 0.0741 0.1499 0.5594 2.3576 8.3513
23 0.0201 0.0289 0.0740 0.1495 0.5352 2.3592 8.4799
Compressed data

Experiment 1

Location[^16]  2 x 2[^10]  4 x 4[^11]  8 x 8[^12]  16 x 16[^13]  32 x 32[^14]  10938 x 15 x 15[^15]
1 0.0501 0.0797 0.1880 0.5738 1.9890 0.1581
2 0.0493 0.0773 0.1840 0.5657 1.9727 0.1570
3 0.0503 0.0799 0.1869 0.5700 1.9885 0.1585
4 0.0502 0.0798 0.1872 0.5688 1.9897 0.1572
5 0.0494 0.0778 0.1853 0.5683 1.9818 0.1605
6 0.0503 0.0801 0.1877 0.5636 1.9895 0.1594
7 0.0499 0.0784 0.1846 0.5593 1.9579 0.1615
8 0.0492 0.0773 0.1824 0.5552 1.9714 0.1583
9 0.0493 0.0773 0.1828 0.5556 1.9712 0.1595
10 0.0493 0.0779 0.1841 0.5603 1.9848 0.1628
11 0.0504 0.0798 0.1880 0.5617 1.9857 0.1602
12 0.0499 0.0795 0.1877 0.5644 1.9856 0.1592
13 0.0493 0.0777 0.1825 0.5586 1.9790 0.1569
14 0.0495 0.0777 0.1834 0.5541 1.9629 0.1598
15 0.0503 0.0796 0.1872 0.5629 1.9818 0.1586
16 0.0493 0.0793 0.1872 0.5717 1.9868 0.1625
17 0.0498 0.0795 0.1875 0.5735 1.9912 0.1601
18 0.0496 0.0795 0.1878 0.5637 1.9794 0.1595
19 0.0487 0.0780 0.1840 0.5614 1.9766 0.1624
20 0.0488 0.0779 0.1852 0.5615 1.9804 0.1576
21 0.0497 0.0798 0.1875 0.5676 1.9785 0.1607
22 0.0482 0.0772 0.1835 0.5579 1.9793 0.1565
23 0.0492 0.0796 0.1871 0.5647 1.9707 0.1601

Experiment 2

Location[^16]  2 x 2[^10]  4 x 4[^11]  8 x 8[^12]  16 x 16[^13]  32 x 32[^14]  10938 x 15 x 15[^15]
1 0.0499 0.0810 0.1874 0.5621 1.9835 0.1583
2 0.0493 0.0780 0.1832 0.5585 1.9724 0.1581
3 0.0504 0.0808 0.1875 0.5603 1.9877 0.1610
4 0.0504 0.0808 0.1875 0.5626 1.9880 0.1607
5 0.0495 0.0789 0.1854 0.5598 1.9815 0.1573
6 0.0503 0.0811 0.1886 0.5622 1.9893 0.1577
7 0.0498 0.0792 0.1855 0.5553 1.9599 0.1580
8 0.0493 0.0781 0.1833 0.5556 1.9721 0.1592
9 0.0494 0.0781 0.1831 0.5564 1.9726 0.1597
10 0.0493 0.0788 0.1850 0.5615 1.9864 0.1579
11 0.0504 0.0807 0.1877 0.5616 1.9870 0.1621
12 0.0500 0.0804 0.1884 0.5612 1.9870 0.1602
13 0.0494 0.0787 0.1839 0.5596 1.9800 0.1560
14 0.0494 0.0786 0.1840 0.5554 1.9643 0.1586
15 0.0503 0.0807 0.1880 0.5598 1.9830 0.1598
16 0.0494 0.0802 0.1878 0.5631 1.9880 0.1620
17 0.0498 0.0805 0.1878 0.5641 1.9923 0.1585
18 0.0504 0.0807 0.1888 0.5596 1.9807 0.1594
19 0.0495 0.0791 0.1853 0.5574 1.9777 0.1596
20 0.0497 0.0791 0.1856 0.5606 1.9825 0.1579
21 0.0504 0.0812 0.1872 0.5573 1.9805 0.1641
22 0.0491 0.0784 0.1844 0.5582 1.9824 0.1627
23 0.0500 0.0805 0.1876 0.5595 1.9714 0.1577

Using IPython's timeit magic

We repeat the same experiments and measure the average reading time in ms (unless stated otherwise) using IPython's %timeit magic.

# Run inside an IPython session: %timeit is an IPython magic, not plain Python
from pathlib import Path

import xarray as xr

for file in Path('.').glob('*.zarr'):
    dataset = xr.open_zarr(file)
    chunks = dataset.SIS.encoding['chunks']
    print(f"Dataset : {file},  Chunks : {chunks}")
    for longitude, latitude in zip(longitude_values, latitude_values):
        print(f'- {longitude=}, {latitude=}')
        # 100 repetitions of the single-location selection and load
        %timeit -r 100 dataset.SIS.sel(lon=longitude, lat=latitude, method='nearest').load()
    print()

Uncompressed data
Location[^16]  2 x 2[^3] ±  4 x 4[^4] ±  8 x 8[^5] ±  16 x 16[^6] ±  32 x 32[^7] ±  64 x 64[^8] (s) ±  128 x 128[^9] (s) ±
1 17.1 0.867 25.1 32 69.4 1.19 158 6.98 515 18.5 2 19.3 10.4 33.8
2 16.4 2.04 25.9 212 68.5 1.02 156 3.07 512 9.63 1.99 13.1 14.1 32.7
3 18 0.254 26 161 68.2 1.32 157 2.79 512 10.5 1.99 13.4 13.2 27.2
4 18.1 0.119 26 763 68.2 1.58 157 3.14 512 9.9 1.99 5.96 13.3 25.8
5 18.1 0.224 26 808 71.2 1.36 157 3.3 513 10 1.99 5.79 13.2 29.6
6 18 0.282 26.1 450 70.5 1.63 157 2.8 512 10.3 1.99 12.6 14.1 33.4
7 18 0.475 26.1 256 69.1 0.983 185 3.18 512 10.2 1.99 13.8 15.2 33.8
8 18 0.345 26.2 263 70.1 1.31 247 7.8 511 2.07 1.99 6.44 15.4 36.8
9 17.9 0.383 26.1 123 68.2 0.859 290 2.87 511 10.5 1.99 7.73 15.4 35.5
10 17.7 0.508 26.2 137 70 0.808 315 3.63 512 9.53 1.99 16.8 14.3 27.9
11 17.8 0.371 25.9 323 70.4 0.701 326 3.34 512 9.59 1.99 5.01 14.3 30.2
12 17.5 0.323 26 508 69.6 0.976 331 2.26 512 9.84 1.99 13 16.6 31.9
13 17.5 0.382 26.1 447 68.3 1.17 334 3.55 511 10 1.99 14.3 16.6 33.1
14 17.6 0.729 26.1 250 69 1.8 335 3.29 510 11.1 1.99 15.5 16.3 33.2
15 17.7 0.535 26.1 431 68.9 1 335 3.25 512 10.1 1.98 15.5 15.2 28.4
16 16.5 1.97 25.8 611 68.4 1.21 305 2.18 511 1.57 1.98 10.6 15.2 27.2
17 17.3 0.978 26 316 68.3 6.28 336 3.08 512 10.2 1.99 10.3 15.2 40.6
18 17.5 0.424 26 185 68.6 6.12 338 3.54 511 1.6 1.98 9.82 16.6 27.8
19 17.6 0.623 26.1 158 68.3 6.07 337 3.3 512 9.81 1.98 10.5 16.3 30.8
20 17.6 0.472 26.2 234 67.9 6.31 338 3.25 511 1.86 1.99 10.6 15.2 29.9
21 17.3 0.848 26.1 227 68 6.3 338 3.21 512 9.65 1.98 10.6 16.6 34.9
22 18 0.301 26.1 315 68.2 5.97 339 2.64 512 9.67 1.99 15.4 16.6 30.4
23 17.5 0.382 26.1 358 69.5 5.77 342 3.73 512 10.9 1.99 15.2 16.3 29.5
Compressed data
Location[^16]  2 x 2[^17] ±  4 x 4[^18] ±  8 x 8[^19] ±  16 x 16[^20] ±  32 x 32[^21] (s) ±  10938 x 15 x 15[^22] ±
1 44.2 0.738 75.5 2.45 249 6.33 650 37.8 2.79 13.4 140 8.46
2 43.3 1.22 71.2 2.14 244 1.07 682 11.4 2.8 4.88 141 7.11
3 43.6 0.833 74.8 1.62 247 1.37 710 9.09 2.81 3.51 145 6.9
4 43.4 0.889 74.5 2.03 247 1.39 728 19.7 2.82 5.14 144 8.3
5 43 0.861 74.4 1.81 245 1.12 732 8.41 2.8 22.5 146 9.09
6 43.9 0.749 75.1 1.77 246 3.3 760 6.96 2.79 4.48 142 6.88
7 43.4 0.834 74.9 1.57 245 1.45 585 8.62 2.78 4.56 145 8.11
8 42.8 0.727 73.3 1.17 242 1.76 636 5.26 2.8 3.4 142 8
9 42.6 0.962 73.2 1.04 176 1.9 677 5.64 2.81 9.68 146 10.3
10 43.5 0.512 72.6 0.789 178 2.4 720 7.35 2.84 61.8 145 8.95
11 44.5 0.559 75.6 1.46 181 1.89 748 5.16 2.81 3.98 148 7.87
12 44.2 0.518 76.2 1.54 181 2.86 580 9.99 2.74 5.29 149 8.1
13 43.5 0.550 75.2 0.984 177 1.69 621 10.2 2.78 5.1 144 8.88
14 42.9 0.730 74.9 1.05 177 2.09 664 9.16 2.78 12.3 146 9.37
15 43.6 1.41 76 1.11 181 2.12 695 7.74 2.78 4.97 149 7.55
16 43.6 0.429 76.6 0.784 180 2.01 700 6.07 2.8 4.15 147 8.47
17 44.1 0.706 76.2 1.09 180 2.09 732 7.66 2.8 6.34 149 7.06
18 44.4 0.612 77.2 0.749 181 2.13 568 6.07 2.8 4.73 148 7.95
19 43.6 0.633 75.6 1.67 178 2.14 610 8.15 2.8 6.42 146 6.87
20 43.7 0.578 75 0.945 178 1.81 653 8.68 2.8 4.52 145 7.79
21 44.5 0.497 76.1 0.842 180 1.95 676 8.35 2.79 4.71 143 9.39
22 43.2 0.506 75.2 0.554 177 1.78 721 8.85 2.82 4.47 147 7.79
23 44.3 0.461 76.8 0.784 181 2.38 742 16.4 2.81 5.51 144 9.33

The geographic coordinates of the 23 test locations:

Location Longitude Latitude
1 8.375 41.67499923706055
2 10.125 45.775001525878906
3 11.524999618530273 35.82500076293945
4 11.225000381469727 36.42499923706055
5 8.675000190734863 40.224998474121094
6 7.125 43.17499923706055
7 17.725000381469727 46.32500076293945
8 10.274999618530273 46.375
9 11.975000381469727 46.32500076293945
10 7.425000190734863 35.775001525878906
11 10.774999618530273 38.474998474121094
12 15.225000381469727 38.375
13 16.475000381469727 39.275001525878906
14 17.174999237060547 43.67499923706055
15 10.774999618530273 40.82500076293945
16 11.074999809265137 36.875
17 7.974999904632568 37.125
18 17.575000762939453 40.125
19 8.875 44.375
20 9.375 40.82500076293945
21 15.675000190734863 36.57500076293945
22 13.925000190734863 41.775001525878906
23 12.824999809265137 44.17499923706055

The source timings have been edited to produce cleaner tables; for example, some figures were converted from s to ms and some from μs to ms.


[^1]: mean ± SD of 100 runs, 10 loops each
[^2]: mean ± SD of 100 runs, 1000 loops each
[^3]: Dataset : chunk_sizes_1_2_2.zarr, per loop (mean ± std. dev. of 100 runs, 100 loops each)
[^4]: Dataset : chunk_sizes_1_4_4.zarr, per loop (mean ± std. dev. of 100 runs, 10 loops each)
[^5]: Dataset : chunk_sizes_1_8_8.zarr, per loop (mean ± std. dev. of 100 runs, 10 loops each) for the set of runs 1 to 17 and 1 loop each for the set of runs 17 to 23
[^6]: Dataset : chunk_sizes_1_16_16.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)
[^7]: Dataset : chunk_sizes_1_32_32.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)
[^8]: Dataset : chunk_sizes_1_64_64.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)
[^9]: Dataset : chunk_sizes_1_128_128.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)
[^10]: Dataset : chunk_sizes_1_2_2_zstd_1.zarr, Chunks : (350016, 2, 2)
[^11]: Dataset : chunk_sizes_1_4_4_zstd_1.zarr, Chunks : (350016, 4, 4)
[^12]: Dataset : chunk_sizes_1_8_8_zstd_1.zarr, Chunks : (350016, 8, 8)
[^13]: Dataset : chunk_sizes_1_16_16_zstd_1.zarr, Chunks : (350016, 16, 16)
[^14]: Dataset : chunk_sizes_1_32_32_zstd_1.zarr, Chunks : (350016, 32, 32)
[^15]: Dataset : chunk_sizes_10938_15_15_zstd_1.zarr, Chunks : (10938, 15, 15)
[^16]: Location index
[^17]: Dataset : chunk_sizes_1_2_2_zstd_1.zarr, Chunks : (350016, 2, 2)
[^18]: Dataset : chunk_sizes_1_4_4_zstd_1.zarr, per loop (mean ± std. dev. of 100 runs, 10 loops each)
[^19]: Dataset : chunk_sizes_1_8_8_zstd_1.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)
[^20]: Dataset : chunk_sizes_1_16_16_zstd_1.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)
[^21]: Dataset : chunk_sizes_1_32_32_zstd_1.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)
[^22]: Dataset : chunk_sizes_10938_15_15_zstd_1.zarr, per loop (mean ± std. dev. of 100 runs, 1 loop each)