Fix fetch record from master failed #2848

Merged: 3 commits merged into PaddlePaddle:develop on Jul 17, 2017

Conversation

Yancey1989 (Contributor) commented Jul 13, 2017

Test code:

  1. Convert the uci_housing dataset to RecordIO files:

    # convert.py
    import paddle.v2.dataset as dataset
    dataset.common.convert(
        output_path="./dataset",
        reader=dataset.uci_housing.train(),
        num_shards=3,
        name_prefix="uci_housing-")

  2. Fetch records with Python code:

    # train.py
    import paddle.v2.master as master
    import paddle.v2.dataset as dataset
    import time
    import cPickle as pickle

    cli = master.client("localhost:8080", 1)
    cli.set_dataset([
        "/work/dataset/uci_housing--00000-of-00002",
        "/work/dataset/uci_housing--00001-of-00002",
        "/work/dataset/uci_housing--00002-of-00002"])
    while 1:
        (r, e) = cli.next_record()
        print pickle.loads(r)
        time.sleep(2)

The script then prints the records correctly:

$python train.py
['/work/dataset/uci_housing--00000-of-00002', '/work/dataset/uci_housing--00001-of-00002', '/work/dataset/uci_housing--00002-of-00002']
(array([-0.03933099, -0.11363636,  0.10092454,  0.93083004, -0.00966062,
        0.01693152,  0.24536662, -0.0392604 , -0.19780031, -0.25236098,
       -0.21867379,  0.09346404, -0.05941124]), array([ 23.]))
(array([-0.03903572, -0.11363636, -0.02004321, -0.06916996, -0.13517502,
        0.01731474, -0.37358292,  0.0136727 , -0.24127857, -0.25045258,
        0.01536877,  0.07346807, -0.09031631]), array([ 28.1]))

typhoonzero (Contributor) previously approved these changes Jul 13, 2017

LGTM!

@@ -201,8 +201,7 @@ def close_writers(w):
 def write_data(w, lines):
     random.shuffle(lines)
     for i, d in enumerate(lines):
-        d = pickle.dumps(d, pickle.HIGHEST_PROTOCOL)
-        w[i % num_shards].write(d)
+        w[i % num_shards].write(str(d))
Contributor

Does it hurt the speed if we remove the pickle module?

Contributor

I am curious as well: why does this change fix the failure?

gongweibao (Contributor) commented Jul 14, 2017

Is the recordio module simply out of date? I already ran into this error when I wrote convert, and helin fixed it back then. Removing pickle to solve this problem seems a bit strange...

Contributor

Curious why we need pickle at all, since the master parses the RecordIO files directly?

Yancey1989 (Contributor, Author)

Same as @typhoonzero: the master needs RecordIO files, but the convert function writes them with pickle.
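For reference, a minimal sketch of the round trip that has to match (illustrative values only, not the actual PaddlePaddle code paths): whatever write_data serializes per record is exactly what pickle.loads must be able to decode on the reader side.

import cPickle as pickle

# Illustrative record; the real uci_housing reader yields (features, label) tuples.
features, label = [0.1] * 13, [23.0]

# Writer side (inside convert's write_data): serialize before w.write(...).
payload = pickle.dumps((features, label))

# Reader side (after master_client.next_record()): the bytes must unpickle back.
assert pickle.loads(payload) == (features, label)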

Contributor

What the user writes may be an object, such as a list; str cannot serialize that, can it?
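To see the point concretely, a small standalone sketch with plain Python lists standing in for the real records: str() only yields a human-readable repr with no generic inverse, while pickle round-trips the object.

import cPickle as pickle

record = ([1.0, 2.0, 3.0], [23.0])  # stand-in for a (features, label) tuple

s = str(record)           # just a repr string; pickle.loads(s) would fail, there is no generic inverse
p = pickle.dumps(record)  # bytes that pickle.loads() restores exactly
assert pickle.loads(p) == record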

Yancey1989 (Contributor, Author)

Thanks all! I got it and fixed it. Done.

helinwang (Contributor) previously approved these changes Jul 13, 2017

LGTM! Two minor comments.

@@ -38,7 +39,8 @@ func (c *Client) getRecords() {
 		if err != nil {
 			// TODO(helin): wait before move on with next
Contributor

Can you remove the TODO? It is already done.

Yancey1989 (Contributor, Author)

Done.

helinwang (Contributor)

Agree with @gongweibao on #2848 (comment), good catch!

typhoonzero (Contributor)

Indeed, the dataset readers all yield tuples; the corresponding RecordIO reader should look like this:

import cPickle as pickle
import paddle.v2.master as master

def cloud_reader():
    # etcd_endpoint is the address of the etcd service used by the master.
    master_client = master.client(etcd_endpoint, 5, 64)
    master_client.set_dataset(
        ["/pfs/dlnel/public/dataset/uci_housing/uci_housing-*-of-*"])
    while 1:
        r, e = master_client.next_record()
        if not r:
            break
        yield pickle.loads(r)
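A hypothetical usage sketch for the cloud_reader above; the etcd address is an assumption, and each record unpacks into a (features, label) tuple as in the output earlier in this thread.

etcd_endpoint = "http://127.0.0.1:2379"  # assumed etcd address, adjust for your cluster

for features, label in cloud_reader():
    print features, label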

typhoonzero (Contributor) commented Jul 14, 2017

When using pickle.loads(r) it reports:

Traceback (most recent call last):
  File "tt.py", line 22, in <module>
    for d in cloud_reader():
  File "tt.py", line 20, in cloud_reader
    yield pickle.loads(r)
EOFError

Fixed, please follow #2848 (comment) :) -- Yancey1989

@@ -201,8 +201,10 @@ def close_writers(w):
 def write_data(w, lines):
     random.shuffle(lines)
     for i, d in enumerate(lines):
-        d = pickle.dumps(d, pickle.HIGHEST_PROTOCOL)
Yancey1989 (Contributor, Author)

I changed the protocol version from HIGHEST_PROTOCOL to 0, because HIGHEST_PROTOCOL causes an EOFError:

Traceback (most recent call last):
  File "decode.py", line 13, in <module>
    print pickle.loads(o)
EOFError

I think this error may be caused by the RecordIO library: using a binary stream as input causes an EOFError. The following is my test code; it throws an EOFError:

import recordio
import cPickle as pickle

w = recordio.writer("tmp-recordio")
for i in xrange(10):
    w.write(pickle.dumps(i, pickle.HIGHEST_PROTOCOL))
w.close()

r = recordio.reader("tmp-recordio")
for i in xrange(10):
    o = r.read()
    print pickle.loads(o)
r.close()
=========
Traceback (most recent call last):
  File "decode.py", line 13, in <module>
    print pickle.loads(o)
EOFError

I will follow up on this problem; it probably should not block this fix :)
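For comparison, a sketch of the same round trip using protocol 0, the text-based pickle format this PR switches to; tmp-recordio-0 is just an illustrative file name, and this assumes the same recordio Python package as the test above.

import recordio
import cPickle as pickle

w = recordio.writer("tmp-recordio-0")
for i in xrange(10):
    w.write(pickle.dumps(i, 0))  # protocol 0 emits printable ASCII
w.close()

r = recordio.reader("tmp-recordio-0")
for i in xrange(10):
    print pickle.loads(r.read())
r.close()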


Contributor

https://github.com/PaddlePaddle/recordio/blob/master/python/recordio/recordio_test.py#L9
This unit test indeed does not set a protocol. However, protocol 0 produces a much larger encoded file; pickle.HIGHEST_PROTOCOL or a negative protocol number would be better.
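To make the size trade-off concrete, a minimal standalone sketch that compares the encoded size of one synthetic record under the two protocols (independent of RecordIO):

import cPickle as pickle

record = ([-0.03933099] * 13, [23.0])  # stand-in for one uci_housing record
print len(pickle.dumps(record, 0))                        # protocol 0: text-based encoding
print len(pickle.dumps(record, pickle.HIGHEST_PROTOCOL))  # protocol 2: binary encoding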

Yancey1989 (Contributor, Author)

See the comment above: using pickle.HIGHEST_PROTOCOL makes RecordIO fail to read the data back. It looks like the RecordIO library has some issues with binary input. I am still following up and will fix this in another PR.

typhoonzero (Contributor) left a comment

LGTM!

gongweibao (Contributor) left a comment

LGTM

Yancey1989 merged commit 83f263e into PaddlePaddle:develop on Jul 17, 2017
Yancey1989 deleted the bugfix branch on July 17, 2017 03:56