
Export to dict #586

Draft · wants to merge 24 commits into main
Conversation

samwaseda (Member)
Closes #576

I implemented what we discussed in the issue above. Initially I created an extra file called export.py, but then realized that I cannot import Composite there if I want to be able to export it from the workflow, so I simply put it inside composite.py. I was not so sure about the keys: you see a combination of label, scoped_label, etc., but maybe there's a smarter way to tackle the problem.

Anyway, for this workflow:

@Workflow.wrap.as_function_node
def add_one(a: int):
    result = a + 1
    return result

@Workflow.wrap.as_function_node
def add_two(b: int = 10) -> int:
    result = b + 2
    return result

@Workflow.wrap.as_macro_node
def add_three(macro, c: int) -> int:
    macro.one = add_one(a=c)
    macro.two = add_two(b=macro.one)
    w = macro.two
    return w

wf = Workflow("my_wf")

wf.three = add_three(c=1)
wf.four = add_one(a=wf.three)

wf.run()

wf.export_to_dict()

You can get the dict:

{'inputs': {'three__c': {'value': 1, 'type_hint': int}},
 'outputs': {'four__result': {'value': 5}},
 'nodes': {'three': {'inputs': {'three__c': {'value': 1, 'type_hint': int}},
   'outputs': {'three__w': {'value': 4, 'type_hint': int}},
   'nodes': {'one': {'inputs': {'a': {'value': 1, 'type_hint': int}},
     'outputs': {'result': {'value': 2}},
     'function': <function __main__.add_one(a: int)>},
    'two': {'inputs': {'b': {'default': 10, 'value': 2, 'type_hint': int}},
     'outputs': {'result': {'value': 4, 'type_hint': int}},
     'function': <function __main__.add_two(b: int = 10) -> int>}},
   'edges': [('inputs.three__c', 'one.inputs.a'),
    ('one.outputs.result', 'two.inputs.b'),
    ('two.outputs.result', 'outputs.three__w')]},
  'four': {'inputs': {'a': {'value': 4, 'type_hint': int}},
   'outputs': {'result': {'value': 5}},
   'function': <function __main__.add_one(a: int)>}},
 'edges': [('inputs.three__c', 'one.inputs.a'),
  ('three.outputs.w', 'four.inputs.a')]}
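For readers who want to consume such an export programmatically, here is a small sketch (not part of the PR) that walks the nested structure and flattens all edges into fully qualified paths. The dict literal mirrors the output above, with the `<function ...>` objects replaced by plain strings since they play no role in traversal:

```python
# Assumed structure: the export dict shown above, with function objects
# replaced by plain strings (they are not needed for traversal).
exported = {
    "inputs": {"three__c": {"value": 1, "type_hint": int}},
    "outputs": {"four__result": {"value": 5}},
    "nodes": {
        "three": {
            "inputs": {"three__c": {"value": 1, "type_hint": int}},
            "outputs": {"three__w": {"value": 4, "type_hint": int}},
            "nodes": {
                "one": {
                    "inputs": {"a": {"value": 1, "type_hint": int}},
                    "outputs": {"result": {"value": 2}},
                    "function": "add_one",
                },
                "two": {
                    "inputs": {"b": {"default": 10, "value": 2, "type_hint": int}},
                    "outputs": {"result": {"value": 4, "type_hint": int}},
                    "function": "add_two",
                },
            },
            "edges": [
                ("inputs.three__c", "one.inputs.a"),
                ("one.outputs.result", "two.inputs.b"),
                ("two.outputs.result", "outputs.three__w"),
            ],
        },
        "four": {
            "inputs": {"a": {"value": 4, "type_hint": int}},
            "outputs": {"result": {"value": 5}},
            "function": "add_one",
        },
    },
    "edges": [
        ("inputs.three__c", "one.inputs.a"),
        ("three.outputs.w", "four.inputs.a"),
    ],
}


def collect_edges(data, prefix=""):
    """Flatten the edges of a (possibly nested) export dict into one list,
    prefixing each endpoint with the path of the composite that owns it."""
    edges = [(prefix + a, prefix + b) for a, b in data.get("edges", [])]
    for label, node in data.get("nodes", {}).items():
        if "nodes" in node:  # only composites carry a nested subgraph
            edges += collect_edges(node, prefix=f"{prefix}{label}.")
    return edges
```

`collect_edges(exported)` then yields five fully qualified edges, e.g. `('three.two.outputs.result', 'three.outputs.three__w')` for the innermost connection.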

samwaseda requested a review from liamhuber February 11, 2025 13:13

Binder 👈 Launch a binder notebook on branch pyiron/pyiron_workflow/univ

samwaseda (Member Author)

I'm not quite done with the unit tests, but I wanted to have feedback from @liamhuber before I move things around.

samwaseda added the format_black (trigger the Black formatting bot) label Feb 11, 2025

codacy-production bot commented Feb 11, 2025

Coverage summary from Codacy

See diff coverage on Codacy

Coverage variation: +0.05% (target: -1.00%)
Diff coverage: 96.08%

Coverage variation details:
  Common ancestor commit (a3d38f7): 3429 coverable lines, 3133 covered, 91.37% coverage
  Head commit (b280e64): 3474 (+45) coverable lines, 3176 (+43) covered, 91.42% (+0.05%) coverage

Coverage variation is the difference between the coverage for the head and common ancestor commits of the pull request branch: <coverage of head commit> - <coverage of common ancestor commit>

Diff coverage details:
  Pull request (#586): 51 coverable lines, 49 covered, 96.08% diff coverage

Diff coverage is the percentage of lines that are covered by tests out of the coverable lines that the pull request added or modified: <covered lines added or modified>/<coverable lines added or modified> * 100%
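As a quick sanity check of that formula against this PR's numbers (49 of the 51 coverable changed lines are covered):

```python
# Diff coverage = covered changed lines / coverable changed lines * 100
covered, coverable = 49, 51  # numbers from the Codacy table above
diff_coverage = round(covered / coverable * 100, 2)
# diff_coverage == 96.08, matching the bot's report
```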


samwaseda (Member Author)

So, mypy is failing, and I kind of understand why: I guess it wants the same type for all the dict entries, or at least an explicit hint saying what kinds of types are expected. I don't really know how to make it compatible with what mypy expects - @liamhuber @XzzX?

samwaseda removed the format_black (trigger the Black formatting bot) label Feb 11, 2025
liamhuber (Member) left a comment

Topmost feedback: cool, good momentum!

Then: we absolutely can't merge such a thing as long as it is unsafely bound to Function nodes. This needs to work universally or not at all.

Finally: I think it is going to be possible to go in and add more branching clauses such that you safely cover our entire space of available nodes. However, in the spirit of "don't do hard things", IMO a better option is to first clean up the stuff in pyiron_workflow that is making this a pain in the butt, and then come implement the dictionary abstraction. I linked some existing issues, and in the process of the review wrote two new ones. If we get "Then:" resolved, I'm open to merging; I just think that route is faster now at the cost of more net work and more net trouble later.

@@ -26,6 +27,118 @@
from pyiron_workflow.storage import StorageInterface


def _extract_data(item: Channel, with_values=True, with_default=True) -> dict:
liamhuber (Member)

The type hint is wrong as Channel doesn't have value or default, and IMO this method should probably just live directly on the relevant class -- move it over to DataChannel?
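For illustration, a minimal sketch of what "move it over to DataChannel" could look like. The class below is a stand-in with only the attributes needed here (pyiron_workflow's real DataChannel is much richer), and `to_dict` is a hypothetical method name, not the library's API:

```python
from dataclasses import dataclass
from typing import Any

_NOT_DATA = object()  # stand-in sentinel for "nothing set here"


@dataclass
class DataChannel:
    """Minimal stand-in (NOT pyiron_workflow's real class), just enough to
    show the extraction living on the channel itself."""

    label: str
    value: Any = _NOT_DATA
    default: Any = _NOT_DATA
    type_hint: Any = None

    def to_dict(self, with_values: bool = True, with_default: bool = True) -> dict:
        # Hypothetical method; mirrors the key order seen in the export
        # above: default, then value, then type_hint.
        data: dict = {}
        if with_default and self.default is not _NOT_DATA:
            data["default"] = self.default
        if with_values and self.value is not _NOT_DATA:
            data["value"] = self.value
        if self.type_hint is not None:
            data["type_hint"] = self.type_hint
        return data
```

With that, `DataChannel("b", value=2, default=10, type_hint=int).to_dict()` reproduces the `{'default': 10, 'value': 2, 'type_hint': int}` shape from the example export.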

return data


def _is_internal_connection(channel: Channel, workflow: Composite, io_: str) -> bool:
liamhuber (Member)

*sigh*. What I would like is for all channels to only ever be connected to other channels in the same scope. Unfortunately, nodes with no parent at all complicate this, at the very least. Anyhow, this function isn't bad; it's just that it ought to be sufficient to merely use channel.connected. A cleaner solution would be to first go close #587, which I just opened because here you reminded me that we have this problem 😂, then come back here and not need this function at all.
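To make the check concrete, here is a hedged stand-in sketch of the logic the `_is_internal_connection` helper implements. The Channel and Node classes below are minimal stand-ins, not pyiron_workflow's real classes:

```python
class Channel:
    """Stand-in channel: just a list of connected partner channels."""

    def __init__(self):
        self.connections = []

    @property
    def connected(self):
        return len(self.connections) > 0


class Node:
    """Stand-in node holding plain lists of input/output channels."""

    def __init__(self):
        self.inputs = []
        self.outputs = []


def is_internal_connection(channel, workflow_nodes, io_):
    """A connection is 'internal' when the partner channel sits in the
    given IO panel of one of the workflow's own child nodes."""
    if not channel.connected:
        return False
    return any(channel.connections[0] in getattr(n, io_) for n in workflow_nodes)
```

If all connections were guaranteed to stay in scope (the #587 cleanup), the `any(...)` scan would collapse to just `channel.connected`, which is the reviewer's point.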

return data


def export_node_to_dict(
liamhuber (Member)

This will not work. You hint Node, but not all Nodes have node_function. Hinting Function would be too narrowly scoped, as we do have this possibility for stuff like Transformer.

It is also not a good feeling to have export_node_to_dict in composite.py. The scoping is off. I guess we want to be able to convert nodes to dicts regardless of whether they're composite or not! So I think such tools need to either exist locally next to the thing they're converting, or be collected together in some third location like export.py as you mentioned elsewhere.

I also think that the composites should probably also get a "node function", but it's something like Composite._on_run, or any other placeholder for the fact that we're going to execute the subgraph and how we'll do that.

Honestly, I think you will have a much easier time with this whole PR if you (or you wait for me eventually to) clean up the space you're working in -- i.e. if #504 is closed, we'll have a much cleaner division of "composite" vs "atomic" nodes right in the class hierarchy. #360 might help too, but it's harder and I think you can get away without it here.

Returns:
dict: The exported composite as a dictionary.
"""
data = {"inputs": {}, "outputs": {}, "nodes": {}, "edges": []}
liamhuber (Member)

All composites are Nodes, yet we don't use export_node_to_dict as a starting point -- this makes me uncomfortable. An immediate result is that the data dict gets re-declared here, and we're just relying on future maintainers to keep all the string keys nicely synchronized.

return any(channel.connections[0] in getattr(n, io_) for n in workflow)


def _get_scoped_label(channel: Channel, io_: str) -> str:
liamhuber (Member)

Suggested change
def _get_scoped_label(channel: Channel, io_: str) -> str:
def _scoped_label_to_io_label(channel: Channel, io_: str) -> str:

Function name lies!

_get_scoped_label(inp.value_receiver, "inputs"),
)
)
for node in workflow:
liamhuber (Member)

This is 100% correct, but it reminds me that I meant to open another issue... #588

We could proceed with such a definition now, but we'd need to remember to update it as part of that issue.

for inp in node.inputs:
if _is_internal_connection(inp, workflow, "outputs"):
data["edges"].append(
(
liamhuber (Member)

I still advocate for {} rather than ()

Comment on lines +140 to +147
for out in node.outputs:
if out.value_receiver is not None:
data["edges"].append(
(
_get_scoped_label(out, "outputs"),
f"outputs.{out.value_receiver.scoped_label}",
)
)
liamhuber (Member)

This is super similar to the same scraping for the input value receiver pairs, just a sort of conjugate. I think it should be possible to nicely pull out a _get_value_synchronization_edges(sender: Node, receiver: Node, io: str) function
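A hedged sketch of what such a pulled-out helper might look like, using stand-in channel objects. The helper name comes from the review suggestion; the real pyiron_workflow API may differ (e.g. the sender label would actually go through `_get_scoped_label`):

```python
class FakeChannel:
    """Stand-in with only the two attributes the helper needs."""

    def __init__(self, scoped_label, value_receiver=None):
        self.scoped_label = scoped_label
        self.value_receiver = value_receiver


def get_value_synchronization_edges(channels, io_):
    """Collect one (sender, receiver) pair per channel whose value is
    mirrored to a composite-level channel via value_receiver, mirroring
    the output-panel scraping shown in the diff above."""
    return [
        (channel.scoped_label, f"{io_}.{channel.value_receiver.scoped_label}")
        for channel in channels
        if channel.value_receiver is not None
    ]
```

The input-panel case would be the conjugate (composite panel on the sender side), which is exactly why factoring the two near-duplicates into one helper is attractive.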

liamhuber (Member)

So, mypy is failing and I kind of understand why - I guess it wants to have the same type for all the dict entries, or otherwise at least explicitly have the hint that says what kind of types are expected. I don’t really know how to make it compatible with what mypy expects - @liamhuber @XzzX ?

I haven't dug into the error reports, but given the nature of the problem I suspect you're going to need something akin to NestedDictAlias: TypeAlias = dict[str, "str | object | NestedDictAlias"]

samwaseda marked this pull request as draft February 12, 2025 09:27

samwaseda (Member Author)

Since this is not the most urgent issue, @liamhuber and I decided to keep it on hold until either @liamhuber signs a contract or I finish the development on the semantikon side.

coveralls commented Feb 13, 2025

Pull Request Test Coverage Report for Build 13264438848

Details

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall first build on univ at 91.436%

Totals Coverage Status:
  Change from base Build 13296826982: 91.4%
  Covered Lines: 3171
  Relevant Lines: 3468

💛 - Coveralls

Development

Successfully merging this pull request may close these issues: pyiron_workflow-independent dict containing workflow info

4 participants