Add DataFeeder #6102
Conversation
A v2-API-style data feeder for the book demos. We can feed data directly from a reader.
class DataToLoDTensorConverter(object):
    def __init__(self, place, lod_level, shape, batch_size_dim, dtype):
It seems that batch_size_dim is unused here.
Yes. batch_size_dim could be removed.
elif dtype == core.DataType.FP64:
    self.dtype = 'float64'
elif dtype == core.DataType.INT32:
    self.dtype = 'int32'
There should be an else branch that raises an exception for unsupported dtypes.
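A minimal sketch of the suggested fix, using a dict lookup in place of the if/elif chain. The string tags here are hypothetical stand-ins for the core.DataType enum members in the PR:

```python
# Hypothetical mapping from a dtype tag to a numpy dtype name; the tag
# strings stand in for core.DataType members (an assumption, not the
# actual enum from the PR).
_DTYPE_MAP = {
    'FP32': 'float32',
    'FP64': 'float64',
    'INT32': 'int32',
    'INT64': 'int64',
}


def dtype_to_numpy(dtype_tag):
    try:
        return _DTYPE_MAP[dtype_tag]
    except KeyError:
        # Raise instead of silently leaving the dtype unset, as the
        # reviewer suggests for the missing else branch.
        raise ValueError("unsupported dtype: {!r}".format(dtype_tag))
```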
arr = numpy.array(self.data, dtype=self.dtype).reshape(self.shape)
t = core.LoDTensor()
t.set(arr, self.place)
if self.lod_level != 0:
Nit: this condition would read better as self.lod_level > 0.
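A plain-numpy sketch of the conversion step in this hunk; core.LoDTensor is stood in by the raw array, and the LoD attachment for lod_level > 0 is elided:

```python
import numpy


def to_array(data, dtype, shape):
    # Flatten the accumulated Python data into one array and reshape it,
    # mirroring the numpy.array(...).reshape(...) call in the diff. In
    # the real converter the result would be copied into a LoDTensor.
    return numpy.array(data, dtype=dtype).reshape(shape)
```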
self.feed_names.append(each_var.name)
shape = each_var.shape
batch_size_dim = -1
for i, s in enumerate(shape):
Isn't batch_size_dim always the first dimension?
No, batch_size_dim is not the first dimension if we use a static RNN.
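A hypothetical helper mirroring the loop in the hunk above: the batch dimension is the one marked with a negative size, and as the reply notes, with a static RNN it need not be the first dimension, so we scan for it:

```python
def find_batch_size_dim(shape):
    """Return the index of the batch dimension in `shape`, or -1 if none.

    Illustrative sketch (not the PR's exact code): the batch dimension
    is the one whose size is negative in the variable's declared shape.
    """
    for i, s in enumerate(shape):
        if s < 0:
            return i
    return -1  # no batch dimension found
```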
converter = []
for lod_level, shape, dtype in six.zip(
        self.feed_lod_level, self.feed_shapes, self.feed_dtypes):
    batch_size_dim, shape = shape
This line takes batch_size_dim as the first dimension.
No. At this point shape is already a (batch_size_dim, shape) tuple, so this line only unpacks it.
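A small sketch of the unpacking being discussed, with illustrative values standing in for the attributes built earlier in the PR: each feed_shapes entry is a (batch_size_dim, shape) pair, which is why `batch_size_dim, shape = shape` is an unpacking rather than a reinterpretation of the first dimension:

```python
# Illustrative stand-ins (not the PR's real data): each feed_shapes
# entry pairs the batch dimension index with the variable's shape.
feed_lod_level = [0, 1]
feed_shapes = [(0, (-1, 784)), (0, (-1, 1))]
feed_dtypes = ['float32', 'int64']

converters = []
for lod_level, shape, dtype in zip(feed_lod_level, feed_shapes, feed_dtypes):
    batch_size_dim, shape = shape  # unpack the (index, shape) pair
    converters.append((lod_level, batch_size_dim, shape, dtype))
```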
Also add __all__ to data_feeder.py
Force-pushed from 6508c83 to 2d520eb.
Excellent work
Fix #6101