Photometry integration #938

Draft
wants to merge 56 commits into base: master
Changes from all commits (56 commits)
7295c07
Merge branch 'release/3.2.0' into develop
mayofaulkner Dec 18, 2024
4d208b4
add motion energy to LP task
themattinthehatt Dec 18, 2024
178e0f9
bugfix
themattinthehatt Dec 18, 2024
0b144d6
remove unnecessary comments
themattinthehatt Dec 18, 2024
d138737
flake
themattinthehatt Dec 18, 2024
c3962f7
allow none labels in plot insertions
oliche Dec 21, 2024
e81480b
rename spike sorting log
oliche Dec 21, 2024
558bde4
spike sorting log in alf collection for task signature
oliche Dec 21, 2024
6c37263
add waveform datasets to the spike sorting task signature
oliche Dec 31, 2024
ea32cae
spike sorting loader default switched to iblsorter
oliche Dec 31, 2024
f6b54c4
remove temporary folder before waveform extraction to clear some space
oliche Dec 31, 2024
1d0107c
add iblsorter log to the alf folder
oliche Dec 31, 2024
b9b074d
on GPU lock found, raise if task mode is set to raise
oliche Jan 1, 2025
1834891
patcher: add context to registration exception
oliche Jan 1, 2025
375c7c5
flake
oliche Jan 1, 2025
fed7b2e
check only size not hash for S3 re-runs
oliche Jan 3, 2025
d4b3bba
setup logger for iblsorter
oliche Jan 3, 2025
b520d88
update ibl-neuropixels requirement
oliche Jan 3, 2025
1d2401e
add waveform files to the output list
oliche Jan 3, 2025
1d5bd9a
remove duplicate files in output task
oliche Jan 4, 2025
f3f44ad
replace ricker function that has been removed in scipy
oliche Jan 6, 2025
19c5ea5
Merge pull request #901 from int-brain-lab/aws
oliche Jan 6, 2025
493d8bc
update environment for litpose task
mayofaulkner Jan 7, 2025
06b0478
updated requirements, updated pipelines
grg2rsr Jan 13, 2025
b4b4787
add in slices kwarg to pass into suite2p params
mayofaulkner Jan 16, 2025
bff5469
fibers are now named fiber_{brain_region} in the extraction process
grg2rsr Jan 17, 2025
362d590
add default n_ephys value to criterion delay
mayofaulkner Jan 20, 2025
9b962bd
unpack criteria
mayofaulkner Jan 20, 2025
bad825d
catch error for displaying info
mayofaulkner Jan 20, 2025
c519313
Merge branch 'develop' into litpose
oliche Jan 21, 2025
745f668
avoid dummy package for iblqt
oliche Jan 21, 2025
caa8562
Merge pull request #900 from int-brain-lab/litpose
oliche Jan 21, 2025
d49cb2b
plot to create a star plot
oliche Jan 22, 2025
7c9212d
most of the time it doesn't make sense to have a different origin than 0
oliche Jan 22, 2025
399bd94
ensure sparse_npz files are registered
mayofaulkner Jan 27, 2025
27c5a36
Merge pull request #917 from int-brain-lab/starplot
oliche Jan 29, 2025
c2aecb6
One3 (#926)
oliche Jan 31, 2025
e34ae46
correct positions
mayofaulkner Feb 3, 2025
473236a
Support for UUID objects in Notes
k1o0 Feb 7, 2025
e45c63f
Merge branch 'develop' into mesoscope_multidepth
mayofaulkner Feb 12, 2025
f0c2ada
Merge branch 'develop' into docs
mayofaulkner Feb 12, 2025
b932bc9
Fix error in MesoscopeCompress when compressing multiple imaging bouts
k1o0 Feb 13, 2025
6a719c3
Selectively remove old FOV datasets in MesoscopePreprocess (issue #928)
k1o0 Feb 13, 2025
e8a4c1e
update queries for ONE 3
mayofaulkner Feb 17, 2025
74f578b
Merge pull request #936 from int-brain-lab/docs
mayofaulkner Feb 17, 2025
5d9d63b
Merge pull request #908 from int-brain-lab/mesoscope_multidepth
mayofaulkner Feb 17, 2025
d604865
Add PROJECT_EXTRACTION_VERSION and TASK_VERSION to session JSON (issu…
k1o0 Jan 14, 2025
c7a500a
Remove junk IDE folder
k1o0 Jan 23, 2025
fa6d722
Handle extraction of sessions where first Bpod trial missed on FPGA (…
k1o0 Jan 23, 2025
57f7164
fixing shifted sync timestamps in the extraction
grg2rsr Feb 24, 2025
b606290
reading digital inputs file via iblphotometry.io (validated)
grg2rsr Feb 24, 2025
9a9ff73
Update ONE version requirement
k1o0 Feb 28, 2025
be71131
Bump version; use ensure_list
k1o0 Mar 3, 2025
16ffbc4
Merge remote-tracking branch 'origin/develop' into photometry-integra…
grg2rsr Mar 4, 2025
8ae4039
fix for reextraction (.pqt file read instead of .csv) for digital_inp…
grg2rsr Mar 4, 2025
3fd7623
bugfix in the extractor after fix in the experiment description file
grg2rsr Mar 6, 2025
24 changes: 12 additions & 12 deletions brainbox/behavior/training.py
@@ -378,7 +378,7 @@ def get_training_status(trials, task_protocol, ephys_sess_dates, n_delay):
ephys_sess_dates])
n_ephys_trials = np.array([compute_n_trials(trials[k]) for k in ephys_sess_dates])

pass_criteria, criteria = criterion_delay(n_ephys, n_ephys_trials, perf_ephys_easy)
pass_criteria, criteria = criterion_delay(n_ephys_trials, perf_ephys_easy, n_ephys=n_ephys)

if pass_criteria:
status = 'ready4delay'
@@ -430,24 +430,24 @@ def display_status(subj, sess_dates, status, perf_easy=None, n_trials=None, psyc
f"{sess_dates[2]}]")
elif psych_20 is None:
print(f"\n{subj} : {status} \nSession dates={[x for x in sess_dates]}, "
f"Perf easy={[np.around(pe,2) for pe in perf_easy]}, "
f"Perf easy={[np.around(pe, 2) for pe in perf_easy]}, "
f"N trials={[nt for nt in n_trials]} "
f"\nPsych fit over last 3 sessions: "
f"bias={np.around(psych[0],2)}, thres={np.around(psych[1],2)}, "
f"lapse_low={np.around(psych[2],2)}, lapse_high={np.around(psych[3],2)} "
f"bias={np.around(psych[0], 2)}, thres={np.around(psych[1], 2)}, "
f"lapse_low={np.around(psych[2], 2)}, lapse_high={np.around(psych[3], 2)} "
f"\nMedian reaction time at 0 contrast over last 3 sessions = "
f"{np.around(rt,2)}")
f"{np.around(rt, 2)}")

else:
print(f"\n{subj} : {status} \nSession dates={[x for x in sess_dates]}, "
f"Perf easy={[np.around(pe,2) for pe in perf_easy]}, "
f"Perf easy={[np.around(pe, 2) for pe in perf_easy]}, "
f"N trials={[nt for nt in n_trials]} "
f"\nPsych fit over last 3 sessions (20): "
f"bias={np.around(psych_20[0],2)}, thres={np.around(psych_20[1],2)}, "
f"lapse_low={np.around(psych_20[2],2)}, lapse_high={np.around(psych_20[3],2)} "
f"\nPsych fit over last 3 sessions (80): bias={np.around(psych_80[0],2)}, "
f"thres={np.around(psych_80[1],2)}, lapse_low={np.around(psych_80[2],2)}, "
f"lapse_high={np.around(psych_80[3],2)} "
f"bias={np.around(psych_20[0], 2)}, thres={np.around(psych_20[1], 2)}, "
f"lapse_low={np.around(psych_20[2], 2)}, lapse_high={np.around(psych_20[3], 2)} "
f"\nPsych fit over last 3 sessions (80): bias={np.around(psych_80[0], 2)}, "
f"thres={np.around(psych_80[1], 2)}, lapse_low={np.around(psych_80[2], 2)}, "
f"lapse_high={np.around(psych_80[3], 2)} "
f"\nMedian reaction time at 0 contrast over last 3 sessions = "
f"{np.around(rt, 2)}")

@@ -997,7 +997,7 @@ def criterion_ephys(psych_20, psych_80, n_trials, perf_easy, rt):
return passing, criteria


def criterion_delay(n_ephys, n_trials, perf_easy):
def criterion_delay(n_trials, perf_easy, n_ephys=1):
"""
Returns bool indicating whether criteria for 'ready4delay' is met.

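Note on the hunks above: the signature of `criterion_delay` changes from `criterion_delay(n_ephys, n_trials, perf_easy)` to `criterion_delay(n_trials, perf_easy, n_ephys=1)`, so `n_ephys` becomes an optional keyword. A minimal sketch of the migrated call, with illustrative values:

```python
import numpy as np
from brainbox.behavior.training import criterion_delay

# Illustrative per-session values for three ephys-rig sessions
n_ephys_trials = np.array([420, 510, 465])      # trial counts
perf_ephys_easy = np.array([0.92, 0.88, 0.95])  # easy-trial performance

# New-style call: n_ephys is passed by keyword and defaults to 1
pass_criteria, criteria = criterion_delay(n_ephys_trials, perf_ephys_easy, n_ephys=3)
```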
26 changes: 14 additions & 12 deletions brainbox/ephys_plots.py
@@ -439,20 +439,22 @@ def plot_brain_regions(channel_ids, channel_depths=None, brain_regions=None, dis
bar_kwargs.update(**kwargs)
color = col / 255
ax.bar(x=0.5, height=height, color=color, bottom=reg[0], **kwargs)
if label == 'right':
ax.yaxis.tick_right()
ax.set_yticks(region_labels[:, 0].astype(int))
ax.yaxis.set_tick_params(labelsize=8)
ax.set_ylim(np.nanmin(channel_depths), np.nanmax(channel_depths))
ax.get_xaxis().set_visible(False)
ax.set_yticklabels(region_labels[:, 1])
if label == 'right':
ax.yaxis.tick_right()
ax.spines['left'].set_visible(False)
else:
ax.spines['right'].set_visible(False)
ax.spines['top'].set_visible(False)
ax.spines['bottom'].set_visible(False)
if label is not None:
if label == 'right':
ax.yaxis.tick_right()
ax.set_yticks(region_labels[:, 0].astype(int))
ax.yaxis.set_tick_params(labelsize=8)
ax.set_ylim(np.nanmin(channel_depths), np.nanmax(channel_depths))
ax.get_xaxis().set_visible(False)
ax.set_yticklabels(region_labels[:, 1])
if label == 'right':
ax.yaxis.tick_right()
ax.spines['left'].set_visible(False)
else:
ax.spines['right'].set_visible(False)

if title:
ax.set_title(title)

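The refactor above moves all axis decoration under a single `if label is not None:` guard, so passing `label=None` now skips the y-tick region labels entirely. A hedged usage sketch (the `ax` keyword is assumed from the surrounding function body; channel IDs and depths are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt
from brainbox.ephys_plots import plot_brain_regions

# Illustrative Allen atlas IDs and channel depths (um)
channel_ids = np.array([512, 512, 1018, 1018, 997])
channel_depths = np.array([0., 20., 40., 60., 80.])

fig, (ax_labelled, ax_bare) = plt.subplots(1, 2)
plot_brain_regions(channel_ids, channel_depths=channel_depths, ax=ax_labelled, label='right')
plot_brain_regions(channel_ids, channel_depths=channel_depths, ax=ax_bare, label=None)
plt.show()
```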
2 changes: 0 additions & 2 deletions brainbox/examples/.idea/.gitignore

This file was deleted.

11 changes: 0 additions & 11 deletions brainbox/examples/.idea/examples.iml

This file was deleted.


7 changes: 0 additions & 7 deletions brainbox/examples/.idea/misc.xml

This file was deleted.

8 changes: 0 additions & 8 deletions brainbox/examples/.idea/modules.xml

This file was deleted.

6 changes: 0 additions & 6 deletions brainbox/examples/.idea/vcs.xml

This file was deleted.

7 changes: 5 additions & 2 deletions brainbox/io/one.py
@@ -899,12 +899,13 @@ def load_spike_sorting_object(self, obj, *args, **kwargs):
self.download_spike_sorting_object(obj, *args, **kwargs)
return self._load_object(self.files[obj])

def get_version(self, spike_sorter='pykilosort'):
def get_version(self, spike_sorter=None):
spike_sorter = (spike_sorter or self.spike_sorter) or 'iblsorter'
collection = self._get_spike_sorting_collection(spike_sorter=spike_sorter)
dset = self.one.alyx.rest('datasets', 'list', session=self.eid, collection=collection, name='spikes.times.npy')
return dset[0]['version'] if len(dset) else 'unknown'

def download_spike_sorting_object(self, obj, spike_sorter='pykilosort', dataset_types=None, collection=None,
def download_spike_sorting_object(self, obj, spike_sorter=None, dataset_types=None, collection=None,
attribute=None, missing='raise', **kwargs):
"""
Downloads an ALF object
@@ -917,6 +918,8 @@ def download_spike_sorting_object(self, obj, spike_sorter='pykilosort', dataset_
:param missing: 'raise' (default) or 'ignore'
:return:
"""
if spike_sorter is None:
spike_sorter = self.spike_sorter if self.spike_sorter is not None else 'iblsorter'
if len(self.collections) == 0:
return {}, {}, {}
self.collection = self._get_spike_sorting_collection(spike_sorter=spike_sorter)
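Both `get_version` and `download_spike_sorting_object` now resolve the sorter the same way: the explicit argument wins, then the loader's own `spike_sorter` attribute, then the new `'iblsorter'` default. A standalone sketch of that resolution order:

```python
def resolve_sorter(explicit=None, loader_attr=None):
    """Mirror of `(spike_sorter or self.spike_sorter) or 'iblsorter'`."""
    return (explicit or loader_attr) or 'iblsorter'

assert resolve_sorter('pykilosort', 'kilosort2.5') == 'pykilosort'  # argument wins
assert resolve_sorter(None, 'kilosort2.5') == 'kilosort2.5'         # loader attribute next
assert resolve_sorter() == 'iblsorter'                              # new default last
```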
27 changes: 26 additions & 1 deletion brainbox/tests/test_behavior.py
@@ -1,6 +1,7 @@
from pathlib import Path
import unittest
from unittest import mock
from functools import partial
import numpy as np
import pickle
import copy
@@ -250,7 +251,10 @@ def test_query_criterion(self):
'ready4ephysrig': ['2019-04-10', 'abf5109c-d780-44c8-9561-83e857c7bc01'],
'ready4recording': ['2019-04-11', '7dc3c44b-225f-4083-be3d-07b8562885f4']
}
with mock.patch.object(one.alyx, 'rest', return_value={'json': {'trained_criteria': status_map}}):

# Mock output of subjects read endpoint only
side_effect = partial(self._rest_mock, one.alyx.rest, {'json': {'trained_criteria': status_map}})
with mock.patch.object(one.alyx, 'rest', side_effect=side_effect):
eid, n_sessions, n_days = train.query_criterion(subject, 'in_training', one=one)
self.assertEqual('01390fcc-4f86-4707-8a3b-4d9309feb0a1', eid)
self.assertEqual(1, n_sessions)
@@ -267,3 +271,24 @@ def test_query_criterion(self):
self.assertIsNone(n_sessions)
self.assertIsNone(n_days)
self.assertRaises(ValueError, train.query_criterion, subject, 'foobar', one=one)

def _rest_mock(self, alyx_rest, return_value, *args, **kwargs):
"""Mock return value of AlyxClient.rest function depending on input.

If using the subjects endpoint, returns `return_value`; otherwise, calls the original method.

Parameters
----------
alyx_rest : function
one.webclient.AlyxClient.rest method.
return_value : any
The mock data to return.

Returns
-------
dict, list
Either `return_value` or the original method output.
"""
if args[0] == 'subjects':
return return_value
return alyx_rest(*args, **kwargs)
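
The new `_rest_mock` helper, bound with `functools.partial`, lets the test stub out only the `subjects` endpoint while every other REST call falls through to the real method. The same selective-mocking pattern in isolation (the class and data here are illustrative):

```python
from functools import partial
from unittest import mock

class FakeAlyx:
    def rest(self, endpoint, action, **kwargs):
        return f'real response for {endpoint}/{action}'

def rest_mock(original_rest, return_value, *args, **kwargs):
    """Return canned data for the subjects endpoint; delegate everything else."""
    if args[0] == 'subjects':
        return return_value
    return original_rest(*args, **kwargs)

alyx = FakeAlyx()
canned = {'json': {'trained_criteria': {}}}
side_effect = partial(rest_mock, alyx.rest, canned)  # bind the real method before patching
with mock.patch.object(alyx, 'rest', side_effect=side_effect):
    assert alyx.rest('subjects', 'read') == canned                             # mocked
    assert alyx.rest('sessions', 'list') == 'real response for sessions/list'  # passed through
```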
4 changes: 2 additions & 2 deletions examples/exploring_data/data_download.ipynb
@@ -252,8 +252,8 @@
"metadata": {},
"outputs": [],
"source": [
"# Find an example session with data\n",
"eid, *_ = one.search(project='brainwide', dataset='alf/')\n",
"# Find an example session with trials data\n",
"eid, *_ = one.search(project='brainwide', dataset='_ibl_trials.table.pqt')\n",
"# List datasets associated with a session, in the alf collection\n",
"datasets = one.list_datasets(eid, collection='alf*')\n",
"\n",
2 changes: 1 addition & 1 deletion ibllib/__init__.py
@@ -2,7 +2,7 @@
import logging
import warnings

__version__ = '3.2.0'
__version__ = '3.3.0'
warnings.filterwarnings('always', category=DeprecationWarning, module='ibllib')

# if this becomes a full-blown library we should let the logging configuration to the discretion of the dev
55 changes: 45 additions & 10 deletions ibllib/io/extractors/ephys_fpga.py
@@ -806,8 +806,13 @@ def _extract(self, sync=None, chmap=None, sync_collection='raw_ephys_data',
fcn, *_ = ibldsp.utils.sync_timestamps(bpod_start, t_trial_start)
buffer = 2.5 # the number of seconds to include before/after task
start, end = fcn(self.bpod_trials['intervals'].flat[[0, -1]])
tmin = min(sync['times'][0], start - buffer)
tmax = max(sync['times'][-1], end + buffer)
# NB: The following was added by k1o0 in commit b31d14e5113180b50621c985b2f230ba84da1dd3
# however, it is not clear why this was necessary, and it appears to defeat the purpose of
# removing the passive protocol part from the final trial extraction in ephysChoiceWorld.
# tmin = min(sync['times'][0], start - buffer)
# tmax = max(sync['times'][-1], end + buffer)
tmin = start - buffer
tmax = end + buffer
else: # This type of alignment fails for some sessions, e.g. mesoscope
tmin = tmax = None
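
The effect of the restored lines is that the extraction window hugs the task itself rather than stretching to the full sync trace, which previously pulled a trailing passive protocol into trial extraction. A toy illustration of the window arithmetic, with invented times:

```python
buffer = 2.5                 # seconds of padding around the task
start, end = 100.0, 1600.0   # task bounds mapped to DAQ time by fcn(...)
sync_first, sync_last = 0.0, 2400.0  # full trace, incl. a passive protocol after the task

# Old behaviour: window covered the whole recording
old_tmin, old_tmax = min(sync_first, start - buffer), max(sync_last, end + buffer)  # 0.0, 2400.0
# Restored behaviour: window bounded by the task only
tmin, tmax = start - buffer, end + buffer  # 97.5, 1602.5
```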

@@ -934,22 +939,50 @@ def build_trials(self, sync, chmap, display=False, **kwargs):
# Sync the Bpod clock to the DAQ.
# NB: The Bpod extractor typically drops the final, incomplete, trial. Hence there is
# usually at least one extra FPGA event. This shouldn't affect the sync. The final trial is
# dropped after assigning the FPGA events, using the `ifpga` index. Doing this after
# dropped after assigning the FPGA events, using the `ibpod` index. Doing this after
# assigning the FPGA trial events ensures the last trial has the correct timestamps.
self.bpod2fpga, drift_ppm, ibpod, ifpga = self.sync_bpod_clock(self.bpod_trials, fpga_events, self.sync_field)

if np.any(np.diff(ibpod) != 1) and self.sync_field == 'intervals_0':
bpod_start = self.bpod2fpga(self.bpod_trials['intervals'][:, 0])
missing_bpod_idx = np.setxor1d(ibpod, np.arange(len(bpod_start)))
if missing_bpod_idx.size > 0 and self.sync_field == 'intervals_0':
# One issue is that sometimes pulses may not have been detected, in this case
# add the events that have not been detected and re-extract the behaviour sync.
# This is only really relevant for the Bpod interval events as the other TTLs are
# from devices where a missing TTL likely means the Bpod event was truly absent.
_logger.warning('Missing Bpod TTLs; reassigning events using aligned Bpod start times')
bpod_start = self.bpod_trials['intervals'][:, 0]
missing_bpod = self.bpod2fpga(bpod_start[np.setxor1d(ibpod, np.arange(len(bpod_start)))])
t_trial_start = np.sort(np.r_[fpga_events['intervals_0'][:, 0], missing_bpod])
missing_bpod = bpod_start[missing_bpod_idx]
# Another complication: if the first trial start is missing on the FPGA, the second
# trial start is assumed to be the first and is mis-assigned to another trial event
# (i.e. valve open). This is done because the first Bpod pulse is irregularly long.
# See `FpgaTrials.get_bpod_event_times` for details.

# If the first trial start is missing and the first detected FPGA event doesn't match any
# Bpod starts, then it's probably a mis-assigned valve or trial end event.
i1 = np.any(missing_bpod_idx == 0) and not np.any(np.isclose(fpga_events['intervals_0'][0], bpod_start))
# skip mis-assigned first FPGA trial start
t_trial_start = np.sort(np.r_[fpga_events['intervals_0'][int(i1):], missing_bpod])
ibpod = np.sort(np.r_[ibpod, missing_bpod_idx])
if i1:
# The first trial start is actually the first valve open here
first_on, first_off = bpod_event_intervals['trial_start'][0, :]
bpod_valve_open = self.bpod2fpga(self.bpod_trials['feedback_times'][self.bpod_trials['feedbackType'] == 1])
if np.any(np.isclose(first_on, bpod_valve_open)):
# Probably assigned to the valve open
_logger.debug('Re-reassigning first valve open event. TTL length = %.3g ms', first_off - first_on)
fpga_events['valveOpen_times'] = np.sort(np.r_[first_on, fpga_events['valveOpen_times']])
fpga_events['valveClose_times'] = np.sort(np.r_[first_off, fpga_events['valveClose_times']])
elif np.any(np.isclose(first_on, self.bpod2fpga(self.bpod_trials['itiIn_times']))):
# Probably assigned to the trial end
_logger.debug('Re-reassigning first trial end event. TTL length = %.3g ms', first_off - first_on)
fpga_events['itiIn_times'] = np.sort(np.r_[first_on, fpga_events['itiIn_times']])
fpga_events['intervals_1'] = np.sort(np.r_[first_off, fpga_events['intervals_1']])
else:
_logger.warning('Unable to reassign first trial start event. TTL length = %.3g ms', first_off - first_on)
# Bpod trial_start event intervals are not used but for consistency we'll update them here anyway
bpod_event_intervals['trial_start'] = bpod_event_intervals['trial_start'][1:, :]
else:
t_trial_start = fpga_events['intervals_0']
t_trial_start = t_trial_start[ifpga]

out = alfio.AlfBunch()
# Add the Bpod trial events, converting the timestamp fields to FPGA time.
@@ -1000,10 +1033,12 @@ def build_trials(self, sync, chmap, display=False, **kwargs):
ind_err = np.isnan(fpga_trials['valveOpen_times'])
fpga_trials['feedback_times'][ind_err] = fpga_trials['errorCue_times'][ind_err]

out.update({k: fpga_trials[k] for k in fpga_trials.keys()})
# Use ibpod to discard the final trial if it is incomplete
# ibpod should be indices of all Bpod trials, even those that were not detected on the FPGA
out.update({k: fpga_trials[k][ibpod] for k in fpga_trials.keys()})

if display: # pragma: no cover
width = 0.5
width = 2
ymax = 5
if isinstance(display, bool):
plt.figure('Bpod FPGA Sync')
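The recovery branch above matches the first detected FPGA interval against Bpod-predicted event times with `np.isclose` to decide whether it was mis-filed as a valve open or a trial end. A toy sketch of that matching step (all times invented; `bpod2fpga` assumed to be roughly the identity here):

```python
import numpy as np

# First detected FPGA 'trial start' interval (onset, offset) in DAQ seconds
first_on, first_off = 12.3, 12.4

bpod_trial_starts = np.array([5.0, 20.0, 35.0])  # 12.3 is missing on the FPGA
bpod_valve_open = np.array([12.3, 27.4, 41.9])   # rewarded feedback times
bpod_iti_in = np.array([18.9, 33.8, 49.0])       # trial-end events

if not np.any(np.isclose(first_on, bpod_trial_starts)):
    if np.any(np.isclose(first_on, bpod_valve_open)):
        print(f'mis-assigned valve open; TTL length = {(first_off - first_on) * 1e3:.3g} ms')
    elif np.any(np.isclose(first_on, bpod_iti_in)):
        print('mis-assigned trial end')
    else:
        print('unable to reassign first trial start event')
```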
26 changes: 20 additions & 6 deletions ibllib/io/video.py
@@ -151,12 +151,26 @@


def url_from_eid(eid, label=None, one=None):
"""Return the video URL(s) for a given eid

:param eid: The session id
:param label: The video label (e.g. 'body') or a tuple thereof
:param one: An instance of ONE
:return: The URL string if the label is a string, otherwise a dict of urls with labels as keys
"""Return the video URL(s) for a given eid.

Parameters
----------
eid : UUID, str
The session ID.
label : str, tuple of str
The video label (e.g. 'body') or a tuple thereof.
one : one.api.One
An instance of ONE.

Returns
-------
str, dict of str
The URL string if the label is a string, otherwise a dict of URLs with labels as keys.

Raises
------
ValueError
Video label is unrecognized. See `VIDEO_LABELS` for valid labels.
"""
valid_labels = VIDEO_LABELS
if not (label is None or np.isin(label, valid_labels).all()):
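A hedged usage sketch of the documented return types (the eid is illustrative):

```python
from one.api import ONE
from ibllib.io.video import url_from_eid

one = ONE()
eid = 'aad23144-0e52-4eac-80c5-c4ee2decb198'  # example session ID

body_url = url_from_eid(eid, label='body', one=one)  # str: URL of the body video
all_urls = url_from_eid(eid, one=one)  # dict, e.g. {'left': ..., 'right': ..., 'body': ...}
```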
4 changes: 2 additions & 2 deletions ibllib/oneibl/data_handlers.py
@@ -201,7 +201,7 @@ def filter(self, session_datasets, **kwargs):
Parameters
----------
session_datasets : pandas.DataFrame
An data frame of session datasets.
A data frame of session datasets.
kwargs
Extra arguments for `one.util.filter_datasets`, namely revision_last_before, qc, and
ignore_qc_not_set.
@@ -766,7 +766,7 @@ def setUp(self, **_):
:return:
"""
df = super().getData()
self.one._check_filesystem(df)
self.one._check_filesystem(df, check_hash=False)

def uploadData(self, outputs, version, **kwargs):
"""
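`setUp` now passes `check_hash=False`, in line with the "check only size not hash for S3 re-runs" commit: on re-runs, cached files are validated by size alone. A simplified sketch of the check this implies (assumed behaviour, not the actual `_check_filesystem` code):

```python
from pathlib import Path

def needs_download(local: Path, remote_size: int) -> bool:
    """With hash checking off, re-fetch only when the file is missing or size-mismatched."""
    return not local.exists() or local.stat().st_size != remote_size
```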
5 changes: 4 additions & 1 deletion ibllib/oneibl/patcher.py
@@ -189,7 +189,10 @@ def patch_dataset(self, file_list, dry=False, ftp=False, **kwargs):
return
# from the dataset info, set flatIron flag to exists=True
for p, d in zip(file_list, response):
self._patch_dataset(p, dset_id=d['id'], revision=d['revision'], dry=dry, ftp=ftp)
try:
self._patch_dataset(p, dset_id=d['id'], revision=d['revision'], dry=dry, ftp=ftp)
except Exception as e:
raise Exception(f'Error registering file {p}') from e
return response

def patch_datasets(self, file_list, **kwargs):
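The `raise ... from e` above chains the original exception so its traceback is preserved while the new message names the offending file. The same pattern in miniature:

```python
def patch_all(file_list, patch_one):
    """Apply patch_one to each file, adding the file path as context on failure."""
    for p in file_list:
        try:
            patch_one(p)
        except Exception as e:
            # 'from e' keeps the original traceback attached as __cause__
            raise Exception(f'Error registering file {p}') from e
```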