Draft
127 commits
aa4340d
Moved readtsq to tdt_step2.py.
pauladkisson Nov 17, 2025
c868823
Moved import_np_doric_csv to np_doric_csv_step2.py.
pauladkisson Nov 17, 2025
a06cae4
Split import_csv out from import_np_doric_csv
pauladkisson Nov 17, 2025
66d60e2
Fixed TDT
pauladkisson Nov 17, 2025
4f4e1c9
Split import_doric out from import_np_doric_csv
pauladkisson Nov 17, 2025
341d77d
Removed unnecessary imports
pauladkisson Nov 17, 2025
0bcd4fe
Split import_npm out from import_np_doric_csv
pauladkisson Nov 17, 2025
7b36f64
Added modality selector to the GUI.
pauladkisson Nov 17, 2025
100ad14
Added modality selector to the GUI.
pauladkisson Nov 17, 2025
ef978ec
Added modality option to the api and tests
pauladkisson Nov 17, 2025
6589139
Removed intermediate np_doric_csv_step2 module.
pauladkisson Nov 18, 2025
e7ac4d8
Split tdt_step3.py off from read_raw_data.py.
pauladkisson Nov 18, 2025
2f57867
Hard-coded modality to simplify read.
pauladkisson Nov 18, 2025
092e1b7
Split doric_step3.py off from read_raw_data.py.
pauladkisson Nov 18, 2025
7abb8e0
Added check_doric to doric_step3.py.
pauladkisson Nov 18, 2025
b653538
Split csv_step3.py off from read_raw_data.py.
pauladkisson Nov 18, 2025
6d661c2
Added modality to Step 3.
pauladkisson Nov 18, 2025
a4f6583
Added tdtRecordingExtractor
pauladkisson Nov 19, 2025
882556e
Adapted parallel execute function to use new extractor.
pauladkisson Nov 19, 2025
df7b9e1
Added CsvRecordingExtractor for step 2
pauladkisson Nov 19, 2025
bcb78a5
Installed pre-commit.
pauladkisson Nov 19, 2025
1c8ee07
Added CsvRecordingExtractor for step 3
pauladkisson Nov 19, 2025
9262a5a
Added DoricRecordingExtractor for step 2
pauladkisson Nov 19, 2025
9c5afce
Added DoricRecordingExtractor for step 2
pauladkisson Nov 19, 2025
914f23f
Added DoricRecordingExtractor for step 3
pauladkisson Nov 19, 2025
cd966ae
streamlined inputs
pauladkisson Nov 19, 2025
ac158de
Added NpmRecordingExtractor for step 2
pauladkisson Nov 19, 2025
6a470a1
Added NpmRecordingExtractor for step 3
pauladkisson Nov 20, 2025
4903817
Merge branch 'modularization' into extractor
pauladkisson Nov 21, 2025
9b88cad
Add a tdt_check_data example session to the tests.
pauladkisson Nov 21, 2025
73e6a1c
Added event-splitting to tdt
pauladkisson Nov 21, 2025
a036090
Fixed event vs. new event bug.
pauladkisson Nov 21, 2025
7ecdf78
Fixed event vs. new event bug.
pauladkisson Nov 21, 2025
b87e79f
Refactored save_dict_to_hdf5 to compute event from S.
pauladkisson Nov 21, 2025
1192266
Peeled split_event_storesList from split_event_data.
pauladkisson Nov 21, 2025
9231f5f
updated logging.
pauladkisson Nov 21, 2025
ddf6ae5
Added high-level save
pauladkisson Nov 21, 2025
212c7c5
Added TODO
pauladkisson Nov 21, 2025
33682d2
Added multi-processing back in.
pauladkisson Nov 21, 2025
f84c550
Fixed test_step5.py for tdt_check_data
pauladkisson Nov 21, 2025
c55a230
Fixed test_step4.py for tdt_check_data
pauladkisson Nov 21, 2025
03ffd54
Renamed test_case from tdt_check_data to tdt_split_event.
pauladkisson Nov 21, 2025
27acc6c
Standardize read and save (#188)
pauladkisson Dec 2, 2025
a633550
Remove tkinter from NPM (#189)
pauladkisson Dec 3, 2025
d55bba7
Defined BaseRecordingExtractor.
pauladkisson Dec 3, 2025
1689b7e
Removed obsolete intermediates extractor steps
pauladkisson Dec 3, 2025
b35e04b
Refactored csv_recording_extractor to inherit from base_recording_ext…
pauladkisson Dec 3, 2025
b330a64
Refactored tdt_recording_extractor to inherit from base_recording_ext…
pauladkisson Dec 3, 2025
8af3b2b
Updated parameter names for saveStoresList.
pauladkisson Dec 3, 2025
5dc6d78
Refactored npm_recording_extractor to inherit from base_recording_ext…
pauladkisson Dec 3, 2025
861e991
Refactored doric_recording_extractor to inherit from base_recording_e…
pauladkisson Dec 3, 2025
dd40cb4
Refactored doric_recording_extractor to use class method for events a…
pauladkisson Dec 3, 2025
4619964
Refactored Extractors to use class method discover_events and flags i…
pauladkisson Dec 4, 2025
beb585f
Refactored Extractors to use class method discover_events and flags i…
pauladkisson Dec 4, 2025
1b5e8ca
Added comment about discover_events_and_flags signature
pauladkisson Dec 4, 2025
2e38ee8
Removed unused quarks.
pauladkisson Dec 4, 2025
cdecf42
Refactored NpmRecordingExtractor to inherit from CsvRecordingExtractor.
pauladkisson Dec 4, 2025
d43670f
Updated TODO
pauladkisson Dec 4, 2025
cd245a1
Centralized read_and_save_all_events and read_and_save_event function…
pauladkisson Dec 4, 2025
7e69cc7
Removed redundant intermediate common_step3.py.
pauladkisson Dec 4, 2025
792e421
Added Claude code docs to gitignore.
pauladkisson Dec 5, 2025
60fa0bc
Pulled out analysis-specific functions and io_utils from preprocess.py.
pauladkisson Dec 5, 2025
eadb22f
Organized step 4 analysis functions into various conceptual sub-steps.
pauladkisson Dec 5, 2025
29d5f9a
Removed categorization comments.
pauladkisson Dec 5, 2025
a9a65ab
Removed redundant fns
pauladkisson Dec 5, 2025
37a2f8d
Removed redundant fns
pauladkisson Dec 5, 2025
1bb8de4
Peeled off read operations from timestamp_correction CSV function.
pauladkisson Dec 6, 2025
aa36e33
Inverted name check
pauladkisson Dec 11, 2025
2049c4a
Refactored out write
pauladkisson Dec 11, 2025
8b50fb7
Refactored read and write out of timestampcorrection_tdt
pauladkisson Dec 12, 2025
b734170
Removed, now unused file path parameter.
pauladkisson Dec 12, 2025
4402cbb
Consolidated TDT and CSV timestamp correction functions into a singl…
pauladkisson Dec 12, 2025
ca735ce
Cleaned up some inefficient code
pauladkisson Dec 12, 2025
262681b
Pulled read operations out of the applyCorrection functions.
pauladkisson Dec 12, 2025
b6173dd
split up applyCorrection by ttl vs signal_and_control
pauladkisson Dec 13, 2025
4bfc1a7
Removed commented section.
pauladkisson Dec 13, 2025
b01a58f
Refactored applyCorrection inside timestampCorrection for signal and …
pauladkisson Dec 13, 2025
62cb84f
Pulled write operations back out of timestamp_correction.
pauladkisson Dec 13, 2025
36ba6b8
Pulled write operations out of applyCorrection_ttl.
pauladkisson Dec 15, 2025
05d855e
Move add_control_channel and create_control_channel to the control_ch…
pauladkisson Dec 15, 2025
1f65c14
Moved read and write to standard_io.py.
pauladkisson Dec 15, 2025
b628232
Moved read and write to standard_io.py.
pauladkisson Dec 15, 2025
90e838b
Removed unused functions after the refactor.
pauladkisson Dec 15, 2025
bf57616
Refactored artifact removal separate from z score
pauladkisson Dec 15, 2025
a03d018
Added artifact removal parameter back to execute_zscore.
pauladkisson Dec 15, 2025
e0a4ca8
Removed idle removeArtifacts parameter from compute z-score function.
pauladkisson Dec 15, 2025
44292ae
Streamlined remove artifact branch of the helper_z_score function.
pauladkisson Dec 15, 2025
6da97c0
Streamlined remove artifact branch of the helper_z_score function pt 2
pauladkisson Dec 16, 2025
d8bfcc0
Pulled remove_artifact code out of helper_z_score
pauladkisson Dec 16, 2025
b33c522
Pulled remove_artifact code into dedicated fn
pauladkisson Dec 16, 2025
e87c809
Pulled write code out of helper_z_score
pauladkisson Dec 16, 2025
cf73458
inverted input handling
pauladkisson Dec 16, 2025
7304fae
removed unnecessary parameters
pauladkisson Dec 16, 2025
965f62b
purified helper_z_score
pauladkisson Dec 16, 2025
c49d05f
purified z_score_computation
pauladkisson Dec 16, 2025
a88c026
purified helper_z_score
pauladkisson Dec 16, 2025
bf268f8
Refactored zscore to use a single high-level compute_zscore function …
pauladkisson Dec 16, 2025
4d49fd9
Refactored read-out of addingNaNValues
pauladkisson Dec 16, 2025
a80f080
Refactored read out of removeTTLs
pauladkisson Dec 16, 2025
1b2066d
Refactored read out of eliminateData and eliminateTs
pauladkisson Dec 16, 2025
7275b50
cleaned up addingNaNtoChunksWithArtifacts
pauladkisson Dec 16, 2025
07dcfa8
moved read to the top of addingNaNtoChunksWithArtifacts
pauladkisson Dec 16, 2025
8e03775
moved read out of addingNaNtoChunksWithArtifacts
pauladkisson Dec 16, 2025
7e25db9
Merge branch 'dev' into analysis
pauladkisson Dec 17, 2025
a87c507
fixed data read bug
pauladkisson Dec 17, 2025
b1cbc83
Refactored write operations out of addingNaNtoChunksWithArtifacts
pauladkisson Dec 17, 2025
393d3aa
Refactored filepath out of addingNaNtoChunksWithArtifacts
pauladkisson Dec 17, 2025
22f4f18
Renamed some variables in processTimestampsForArtifacts
pauladkisson Dec 17, 2025
a4a162f
Refactored read out of processTimestampsForArtifacts
pauladkisson Dec 17, 2025
a25e7ac
Refactored read out of processTimestampsForArtifacts
pauladkisson Dec 17, 2025
3c70579
Reorganized processTimestampsForArtifacts
pauladkisson Dec 17, 2025
b7d0549
Removed write from processTimestampsForArtifacts
pauladkisson Dec 17, 2025
61b2712
Removed write from processTimestampsForArtifacts
pauladkisson Dec 17, 2025
2dc18cc
Refactored filepath out of processTimestampsForArtifacts
pauladkisson Dec 17, 2025
bfb18e0
Consolidated write operations
pauladkisson Dec 17, 2025
d4f3de4
Consolidated into single remove_artifacts fn
pauladkisson Dec 17, 2025
c23aa1d
fixed bug with read_control_and_signal
pauladkisson Dec 17, 2025
1cda972
fixed naming bug in timestampCorrection
pauladkisson Dec 17, 2025
19986c8
Fixed combinedata bug
pauladkisson Dec 20, 2025
995b1e2
Fixed combinedata bug
pauladkisson Dec 20, 2025
d4ac68c
Reorganized into execute_combined_data and combine_data.
pauladkisson Dec 20, 2025
042fb33
Renamed some variables for clarity.
pauladkisson Dec 20, 2025
9db15aa
Refactored read operations out of eliminateData.
pauladkisson Dec 20, 2025
0109f83
Cleaned up some indentation in combine_data.
pauladkisson Dec 20, 2025
d3a8fbc
Refactored read operations out of eliminateTs.
pauladkisson Dec 20, 2025
c481d95
Refactored read operations out of eliminateTs.
pauladkisson Dec 20, 2025
ebe24b6
Refactored read operations out of eliminateTs.
pauladkisson Dec 20, 2025
2 changes: 2 additions & 0 deletions .gitignore
@@ -9,3 +9,5 @@ GuPPy/runFiberPhotometryAnalysis.ipynb
.clinerules/

testing_data/

CLAUDE.md
Empty file added src/guppy/analysis/__init__.py
Empty file.
222 changes: 222 additions & 0 deletions src/guppy/analysis/artifact_removal.py
@@ -0,0 +1,222 @@
import logging

import numpy as np

logger = logging.getLogger(__name__)


# Dispatches between the two supported artifact-removal strategies.
def remove_artifacts(
    timeForLightsTurnOn,
    storesList,
    pair_name_to_tsNew,
    pair_name_to_sampling_rate,
    pair_name_to_coords,
    name_to_data,
    compound_name_to_ttl_timestamps,
    method,
):
    if method == "concatenate":
        name_to_corrected_data, pair_name_to_corrected_timestamps, compound_name_to_corrected_ttl_timestamps = (
            processTimestampsForArtifacts(
                timeForLightsTurnOn,
                storesList,
                pair_name_to_tsNew,
                pair_name_to_sampling_rate,
                pair_name_to_coords,
                name_to_data,
                compound_name_to_ttl_timestamps,
            )
        )
        logger.info("Artifacts removed using concatenate method.")
    elif method == "replace with NaN":
        name_to_corrected_data, compound_name_to_corrected_ttl_timestamps = addingNaNtoChunksWithArtifacts(
            storesList,
            pair_name_to_tsNew,
            pair_name_to_coords,
            name_to_data,
            compound_name_to_ttl_timestamps,
        )
        pair_name_to_corrected_timestamps = None
        logger.info("Artifacts removed using NaN replacement method.")
    else:
        logger.error("Invalid artifact removal method specified.")
        raise ValueError("Invalid artifact removal method specified.")

    return name_to_corrected_data, pair_name_to_corrected_timestamps, compound_name_to_corrected_ttl_timestamps


def addingNaNtoChunksWithArtifacts(
    storesList, pair_name_to_tsNew, pair_name_to_coords, name_to_data, compound_name_to_ttl_timestamps
):
    logger.debug("Replacing chunks with artifacts by NaN values.")
    names_for_storenames = storesList[1, :]
    pair_names = pair_name_to_tsNew.keys()

    name_to_corrected_data = {}
    compound_name_to_corrected_ttl_timestamps = {}
    for pair_name in pair_names:
        tsNew = pair_name_to_tsNew[pair_name]
        coords = pair_name_to_coords[pair_name]
        for i in range(len(names_for_storenames)):
            if (
                "control_" + pair_name.lower() in names_for_storenames[i].lower()
                or "signal_" + pair_name.lower() in names_for_storenames[i].lower()
            ):
                data = name_to_data[names_for_storenames[i]].reshape(-1)
                data = addingNaNValues(data=data, ts=tsNew, coords=coords)
                name_to_corrected_data[names_for_storenames[i]] = data
            else:
                # skip control/signal stores that belong to other pairs
                if "control" in names_for_storenames[i].lower() or "signal" in names_for_storenames[i].lower():
                    continue
                ttl_name = names_for_storenames[i]
                compound_name = ttl_name + "_" + pair_name
                ts = compound_name_to_ttl_timestamps[compound_name].reshape(-1)
                ts = removeTTLs(ts=ts, coords=coords)
                compound_name_to_corrected_ttl_timestamps[compound_name] = ts
    logger.info("Chunks with artifacts are replaced by NaN values.")

    return name_to_corrected_data, compound_name_to_corrected_ttl_timestamps


# main function to align timestamps for control, signal and event timestamps for artifact removal
def processTimestampsForArtifacts(
    timeForLightsTurnOn,
    storesList,
    pair_name_to_tsNew,
    pair_name_to_sampling_rate,
    pair_name_to_coords,
    name_to_data,
    compound_name_to_ttl_timestamps,
):
    logger.debug("Processing timestamps to get rid of artifacts using concatenate method...")
    names_for_storenames = storesList[1, :]
    pair_names = pair_name_to_tsNew.keys()

    name_to_corrected_data = {}
    pair_name_to_corrected_timestamps = {}
    compound_name_to_corrected_ttl_timestamps = {}
    for pair_name in pair_names:
        sampling_rate = pair_name_to_sampling_rate[pair_name]
        tsNew = pair_name_to_tsNew[pair_name]
        coords = pair_name_to_coords[pair_name]

        for i in range(len(names_for_storenames)):
            if (
                "control_" + pair_name.lower() in names_for_storenames[i].lower()
                or "signal_" + pair_name.lower() in names_for_storenames[i].lower()
            ):
                data = name_to_data[names_for_storenames[i]]
                data, timestampNew = eliminateData(
                    data=data,
                    ts=tsNew,
                    coords=coords,
                    timeForLightsTurnOn=timeForLightsTurnOn,
                    sampling_rate=sampling_rate,
                )
                name_to_corrected_data[names_for_storenames[i]] = data
                pair_name_to_corrected_timestamps[pair_name] = timestampNew
            else:
                # skip control/signal stores that belong to other pairs
                if "control" in names_for_storenames[i].lower() or "signal" in names_for_storenames[i].lower():
                    continue
                compound_name = names_for_storenames[i] + "_" + pair_name
                ts = compound_name_to_ttl_timestamps[compound_name]
                ts = eliminateTs(
                    ts=ts,
                    tsNew=tsNew,
                    coords=coords,
                    timeForLightsTurnOn=timeForLightsTurnOn,
                    sampling_rate=sampling_rate,
                )
                compound_name_to_corrected_ttl_timestamps[compound_name] = ts

    logger.info("Timestamps processed, artifacts are removed and good chunks are concatenated.")

    return (
        name_to_corrected_data,
        pair_name_to_corrected_timestamps,
        compound_name_to_corrected_ttl_timestamps,
    )


# helper function to process control and signal data: keep only samples inside
# each artifact-free window in `coords`, then re-base timestamps so consecutive
# chunks are separated by exactly one sample period
def eliminateData(*, data, ts, coords, timeForLightsTurnOn, sampling_rate):

    if (data == 0).all():
        data = np.zeros(ts.shape[0])

    arr = np.array([])
    ts_arr = np.array([])
    for i in range(coords.shape[0]):

        index = np.where((ts > coords[i, 0]) & (ts < coords[i, 1]))[0]

        if len(arr) == 0:
            # first chunk: shift so the recording starts at timeForLightsTurnOn
            arr = np.concatenate((arr, data[index]))
            sub = ts[index][0] - timeForLightsTurnOn
            new_ts = ts[index] - sub
            ts_arr = np.concatenate((ts_arr, new_ts))
        else:
            # later chunks: shift to abut the previous chunk, one sample apart
            temp = data[index]
            temp_ts = ts[index]
            new_ts = temp_ts - (temp_ts[0] - ts_arr[-1])
            arr = np.concatenate((arr, temp))
            ts_arr = np.concatenate((ts_arr, new_ts + (1 / sampling_rate)))

    return arr, ts_arr
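
# Worked example of the re-basing arithmetic above (illustrative values only):
# with coords = [[5, 10]], timeForLightsTurnOn = 1 and sampling_rate = 1, the
# samples at ts = 6, 7, 8, 9 are kept and shifted by sub = 6 - 1 = 5 down to
# 1, 2, 3, 4; a second kept chunk would then be appended starting at
# 4 + 1/sampling_rate = 5.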


# helper function to align event timestamps with the control and signal timestamps
def eliminateTs(*, ts, tsNew, coords, timeForLightsTurnOn, sampling_rate):

    ts_arr = np.array([])
    tsNew_arr = np.array([])
    for i in range(coords.shape[0]):
        tsNew_index = np.where((tsNew > coords[i, 0]) & (tsNew < coords[i, 1]))[0]
        ts_index = np.where((ts > coords[i, 0]) & (ts < coords[i, 1]))[0]

        if len(tsNew_arr) == 0:
            sub = tsNew[tsNew_index][0] - timeForLightsTurnOn
            tsNew_arr = np.concatenate((tsNew_arr, tsNew[tsNew_index] - sub))
            ts_arr = np.concatenate((ts_arr, ts[ts_index] - sub))
        else:
            temp_tsNew = tsNew[tsNew_index]
            temp_ts = ts[ts_index]
            new_ts = temp_ts - (temp_tsNew[0] - tsNew_arr[-1])
            new_tsNew = temp_tsNew - (temp_tsNew[0] - tsNew_arr[-1])
            tsNew_arr = np.concatenate((tsNew_arr, new_tsNew + (1 / sampling_rate)))
            ts_arr = np.concatenate((ts_arr, new_ts + (1 / sampling_rate)))

    return ts_arr


# adding NaN values to removed chunks
# when using artifact removal method - replace with NaN
def addingNaNValues(*, data, ts, coords):

    if (data == 0).all():
        data = np.zeros(ts.shape[0])

    arr = np.array([])
    ts_index = np.arange(ts.shape[0])
    for i in range(coords.shape[0]):
        index = np.where((ts > coords[i, 0]) & (ts < coords[i, 1]))[0]
        arr = np.concatenate((arr, index))

    # cast to int: np.concatenate promotes the collected indices to float,
    # and NumPy rejects float arrays as indices
    nan_indices = np.array(list(set(ts_index).symmetric_difference(arr)), dtype=int)
    data[nan_indices] = np.nan

    return data


# remove event TTLs which fall in the removed chunks
# when using artifact removal method - replace with NaN
def removeTTLs(*, ts, coords):
    ts_arr = np.array([])
    for i in range(coords.shape[0]):
        ts_index = np.where((ts > coords[i, 0]) & (ts < coords[i, 1]))[0]
        ts_arr = np.concatenate((ts_arr, ts[ts_index]))

    return ts_arr
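
For orientation, a minimal sketch of how the new remove_artifacts entry point might be driven. All values below (store names, sampling rate, the artifact-free window) are invented for illustration, and the import path assumes the package is importable as guppy; none of this is prescribed by the PR itself.

import numpy as np

from guppy.analysis.artifact_removal import remove_artifacts

# One hypothetical channel pair "A" at 10 Hz with a single artifact-free window (2 s, 8 s).
ts = np.arange(0.0, 10.0, 0.1)
storesList = np.array([["Dv1A", "PrtN"], ["control_A", "RewardedPort"]])

data, ts_corr, ttls = remove_artifacts(
    timeForLightsTurnOn=1.0,
    storesList=storesList,
    pair_name_to_tsNew={"A": ts},
    pair_name_to_sampling_rate={"A": 10.0},
    pair_name_to_coords={"A": np.array([[2.0, 8.0]])},
    name_to_data={"control_A": np.sin(ts)},
    compound_name_to_ttl_timestamps={"RewardedPort_A": np.array([3.0, 9.0])},
    method="concatenate",
)

With method="concatenate" the samples outside (2 s, 8 s) are dropped and the kept chunk is re-based to start at timeForLightsTurnOn; with method="replace with NaN" the same samples are replaced in place and the timestamps dict comes back as None.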
128 changes: 128 additions & 0 deletions src/guppy/analysis/combine_data.py
@@ -0,0 +1,128 @@
import logging
import os

import numpy as np

from .io_utils import (
    decide_naming_convention,
    read_hdf5,
    write_hdf5,
)

logger = logging.getLogger(__name__)


def eliminateData(filepath_to_timestamps, filepath_to_data, timeForLightsTurnOn, sampling_rate):

    arr = np.array([])
    ts_arr = np.array([])
    filepaths = list(filepath_to_timestamps.keys())
    for filepath in filepaths:
        ts = filepath_to_timestamps[filepath]
        data = filepath_to_data[filepath]

        if len(arr) == 0:
            # first recording: shift so it starts at timeForLightsTurnOn
            arr = np.concatenate((arr, data))
            sub = ts[0] - timeForLightsTurnOn
            new_ts = ts - sub
            ts_arr = np.concatenate((ts_arr, new_ts))
        else:
            # later recordings: shift to abut the previous one, one sample apart
            temp = data
            temp_ts = ts
            new_ts = temp_ts - (temp_ts[0] - ts_arr[-1])
            arr = np.concatenate((arr, temp))
            ts_arr = np.concatenate((ts_arr, new_ts + (1 / sampling_rate)))

    return arr, ts_arr


def eliminateTs(filepath_to_timestamps, filepath_to_ttl_timestamps, timeForLightsTurnOn, sampling_rate):

    ts_arr = np.array([])
    tsNew_arr = np.array([])
    filepaths = list(filepath_to_timestamps.keys())
    for filepath in filepaths:
        # the reference (timeCorrection) timestamps drive the offsets; the TTL
        # event timestamps are what gets re-based and returned
        tsNew = filepath_to_timestamps[filepath]
        ts = filepath_to_ttl_timestamps[filepath]
        if len(tsNew_arr) == 0:
            sub = tsNew[0] - timeForLightsTurnOn
            tsNew_arr = np.concatenate((tsNew_arr, tsNew - sub))
            ts_arr = np.concatenate((ts_arr, ts - sub))
        else:
            temp_tsNew = tsNew
            temp_ts = ts
            new_ts = temp_ts - (temp_tsNew[0] - tsNew_arr[-1])
            new_tsNew = temp_tsNew - (temp_tsNew[0] - tsNew_arr[-1])
            tsNew_arr = np.concatenate((tsNew_arr, new_tsNew + (1 / sampling_rate)))
            ts_arr = np.concatenate((ts_arr, new_ts + (1 / sampling_rate)))

    return ts_arr


def combine_data(filepath: list[list[str]], timeForLightsTurnOn, names_for_storenames, sampling_rate):
    # filepath = [[folder1_output_0, folder2_output_0], [folder1_output_1, folder2_output_1], ...]

    logger.debug("Processing timestamps for combining data...")

    names_for_storenames = names_for_storenames[1, :]

    for single_output_filepaths in filepath:
        # single_output_filepaths = [folder1_output_i, folder2_output_i, ...]

        path = decide_naming_convention(single_output_filepaths[0])

        pair_name_to_tsNew = {}
        for j in range(path.shape[1]):
            name_1 = ((os.path.basename(path[0, j])).split(".")[0]).split("_")[-1]
            name_2 = ((os.path.basename(path[1, j])).split(".")[0]).split("_")[-1]
            if name_1 != name_2:
                logger.error("Error in naming convention of files or Error in storesList file")
                raise Exception("Error in naming convention of files or Error in storesList file")
            pair_name = name_1

            for i in range(len(names_for_storenames)):
                if (
                    "control_" + pair_name.lower() in names_for_storenames[i].lower()
                    or "signal_" + pair_name.lower() in names_for_storenames[i].lower()
                ):
                    filepath_to_timestamps = {}
                    filepath_to_data = {}
                    for output_filepath in single_output_filepaths:
                        ts = read_hdf5("timeCorrection_" + pair_name, output_filepath, "timestampNew")
                        data = read_hdf5(names_for_storenames[i], output_filepath, "data").reshape(-1)
                        filepath_to_timestamps[output_filepath] = ts
                        filepath_to_data[output_filepath] = data

                    data, timestampNew = eliminateData(
                        filepath_to_timestamps,
                        filepath_to_data,
                        timeForLightsTurnOn,
                        sampling_rate,
                    )
                    write_hdf5(data, names_for_storenames[i], single_output_filepaths[0], "data")
                    pair_name_to_tsNew[pair_name] = timestampNew
                else:
                    # skip control/signal stores that belong to other pairs
                    if "control" in names_for_storenames[i].lower() or "signal" in names_for_storenames[i].lower():
                        continue
                    filepath_to_timestamps = {}
                    filepath_to_ttl_timestamps = {}
                    for output_filepath in single_output_filepaths:
                        tsNew = read_hdf5("timeCorrection_" + pair_name, output_filepath, "timestampNew")
                        if os.path.exists(os.path.join(output_filepath, names_for_storenames[i] + "_" + pair_name + ".hdf5")):
                            ts = read_hdf5(names_for_storenames[i] + "_" + pair_name, output_filepath, "ts").reshape(-1)
                        else:
                            ts = np.array([])
                        filepath_to_timestamps[output_filepath] = tsNew
                        filepath_to_ttl_timestamps[output_filepath] = ts

                    ts = eliminateTs(
                        filepath_to_timestamps,
                        filepath_to_ttl_timestamps,
                        timeForLightsTurnOn,
                        sampling_rate,
                    )
                    write_hdf5(ts, names_for_storenames[i] + "_" + pair_name, single_output_filepaths[0], "ts")
        for pair_name, tsNew in pair_name_to_tsNew.items():
            write_hdf5(tsNew, "timeCorrection_" + pair_name, single_output_filepaths[0], "timestampNew")
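
And a call-shape sketch for combine_data. The directory names and store names are hypothetical, and the call presumes those output folders already contain the timeCorrection_* and store HDF5 files written by the earlier pipeline steps; nothing here is prescribed by the PR itself.

import numpy as np

from guppy.analysis.combine_data import combine_data

# Hypothetical: stitch two recordings' outputs; each inner list covers one output index.
filepath = [["/data/s1/folder1_output_0", "/data/s1/folder2_output_0"]]
names_for_storenames = np.array([["Dv1A", "Dv2A"], ["control_A", "signal_A"]])

combine_data(
    filepath,
    timeForLightsTurnOn=1.0,
    names_for_storenames=names_for_storenames,
    sampling_rate=10.0,
)

The combined data and re-based timestamps are written back into the first folder of each inner list, so downstream steps read the stitched session from there.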