Please make sure that this is a bug. As per our GitHub Policy,
we only address code/doc bugs, performance issues, feature requests and
build/installation issues on GitHub.
System information
Have I written custom code (as opposed to using a stock example script provided in TensorFlow.js): Yes
OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Fedora 40
Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
TensorFlow.js installed from (npm or script link): pnpm
TensorFlow.js version (use command below): 4.22.0
Browser version: NA
Tensorflow.js Converter Version: NA
Describe the current behavior
Load the model from the attached files
Dispose the model
Result: 36 weights are not disposed. Those weights are all part of a subgraph of which there are 4 copies in the model:
conv2d_Conv2D[7-16]/kernel
conv2d_Conv2D[7-16]/bias
board_analysis_output/kernel
board_analysis_output/bias
Describe the expected behavior
All weights should be disposed.
Standalone code to reproduce the issue
Provide a reproducible test case that is the bare minimum necessary to generate
the problem. If possible, please share a link to Colab/CodePen/any notebook.
model.zip
import * as tf from "@tensorflow/tfjs-node-gpu";

/** See {@link tfRuntime.broadcastTo} */
class BroadcastLayer extends tf.layers.Layer {
  constructor(args) {
    super(args);
    this.shape = args.shape;
  }

  computeOutputShape(inputShape) {
    return this.shape.map((value, index) => {
      if (value == null) {
        return inputShape[index];
      } else {
        return value;
      }
    });
  }

  call(inputs) {
    return tf.tidy(() => {
      const derivedInput = this.getSingleTensor(inputs);
      const inputShape = derivedInput.shape;
      const nonNullShape = this.shape.map((value, index) => {
        if (value == null) {
          return inputShape[index];
        } else {
          return value;
        }
      });
      return derivedInput.broadcastTo([...nonNullShape]);
    });
  }

  getSingleTensor(input) {
    if (input instanceof tf.Tensor) {
      return input;
    } else if (Array.isArray(input) && input.length == 1) {
      return input[0];
    } else {
      throw new Error(`Expected one tensor but received ${input.length}`);
    }
  }

  getConfig() {
    const config = {
      shape: [...this.shape],
    };
    const baseConfig = super.getConfig();
    Object.assign(config, baseConfig);
    return config;
  }

  static fromConfig(cls, config) {
    return new cls(config);
  }
}
/** @nocollapse */
BroadcastLayer.className = "BroadcastLayer";
tf.serialization.registerClass(BroadcastLayer);

/** See {@link tf.Tensor.expandDims} */
class ExpandDimsLayer extends tf.layers.Layer {
  constructor(args) {
    super(args);
    this.dimensionIndices = args.shape;
  }

  computeOutputShape(inputShape) {
    let result = [...inputShape];
    for (const index of this.dimensionIndices) {
      result.splice(index, 0, 1);
    }
    return result;
  }

  call(inputs) {
    return tf.tidy(() => {
      let result = this.getSingleTensor(inputs);
      for (const index of this.dimensionIndices) {
        result = result.expandDims(index);
      }
      return result;
    });
  }

  getSingleTensor(input) {
    if (input instanceof tf.Tensor) {
      return input;
    } else if (Array.isArray(input) && input.length == 1) {
      return input[0];
    } else {
      throw new Error(`Expected one tensor but received ${input.length}`);
    }
  }

  getConfig() {
    const config = {
      shape: [...this.dimensionIndices],
    };
    const baseConfig = super.getConfig();
    Object.assign(config, baseConfig);
    return config;
  }

  static fromConfig(cls, config) {
    return new cls(config);
  }
}
ExpandDimsLayer.className = "ExpandDimsLayer";
tf.serialization.registerClass(ExpandDimsLayer);

const model = await tf.loadLayersModel(`file://${process.env.HOME}/model/model.json`);
model.dispose();
console.log(JSON.stringify(model.weights, undefined, 2));
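As an additional sanity check (a sketch only; tf.memory() is a public API that reports the number of tensors currently alive on the backend), the live tensor count can be logged after dispose(). If everything was released, it should drop back close to its pre-load value:

// Extra check (assumption: nothing else in the script keeps tensors alive).
console.log(`Live tensors after dispose: ${tf.memory().numTensors}`);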
Other info / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
If that's really the intent, it should be added to the documentation (unless I missed it there), or else no one will know to do that.
But speaking of the intent: IMO, model.dispose should dispose these weights. I guess the current logic of model.dispose is to only dispose weights whose reference count is one? If so, it should do that in a loop, until an iteration is unable to dispose any more weights, since one iteration may cause additional weights to become disposable in the next. Otherwise, to be correct, every user of the API would have to do exactly that themselves, and I don't think that is currently possible since the reference count isn't exposed. Other approaches, such as disposing every weight of the model unconditionally, would be incorrect when weights are shared across models.
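For illustration only, here is a minimal sketch of that fixed-point loop. This is not the actual tf.js implementation; refCountOf and disposeWeight are hypothetical stand-ins for internal bookkeeping that the public API does not expose:

// Hypothetical sketch: repeatedly dispose weights that nothing else references,
// until a full pass frees nothing more. refCountOf() and disposeWeight() are
// stand-ins for internals, not real tf.js APIs.
function disposeAllWeights(weights, refCountOf, disposeWeight) {
  const remaining = new Set(weights);
  let freedSomething = true;
  while (freedSomething) {
    freedSomething = false;
    for (const weight of remaining) {
      // Freeing one weight in this pass may make another weight disposable
      // in the next pass, hence the outer loop.
      if (refCountOf(weight) === 1) {
        disposeWeight(weight);
        remaining.delete(weight);
        freedSomething = true;
      }
    }
  }
}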
Consider a scenario where an app loads and uses models of unknown structure. How should it know which shared weights it needs to dispose manually?