This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Trim the features #99

Merged
merged 1 commit into from Sep 19, 2015
28 changes: 10 additions & 18 deletions README.md
@@ -16,26 +16,18 @@ Contents

Features
--------
* Lightweight: small but sharp knife
- mxnet contains concise implementations of state-of-the-art deep learning models
- The project keeps dependencies to a minimum, which makes it portable and easy to build
* Scalable and beyond
- The package already scales to multiple GPUs with an easy-to-use kvstore.
- The same code can be ported to a distributed version once the distributed kvstore is ready.
* Multi-GPU NDArray/Tensor API with auto parallelization
- The package supports a flexible ndarray interface that runs on both CPU and GPU and, more importantly,
automatically parallelizes the computation for you.
* To Mix and Maximize
- Mix all flavors of programming models to maximize flexibility and efficiency.
* Lightweight and scalable
- Minimal build dependency; scales to multiple GPUs and is ready for distributed training.
* Auto parallelization
- Write serial ndarray GPU programs, and let the engine parallelize them for you.
* Language agnostic
- The package currently supports C++ and Python, with a clean C API.
- This makes the package easily portable to other languages and platforms.
- With support for Python and C++, and more to come.
* Cloud friendly
- MXNet works with cloud storage including S3, HDFS, and Azure, both as a data source and for model saving.
- This means you can put data on S3 and use it directly to train your deep model.
* Easy extensibility with no requirement on GPU programming
- The package can be extended at several levels, including Python and C++.
- At all these levels, developers can write numpy-style expressions, either via Python
or the [mshadow expression template](https://github.com/dmlc/mshadow).
- This yields concise and readable code, with performance matching hand-crafted kernels
- Directly load/save from S3, HDFS, Azure
* Easy extensibility
- Extending the package requires no GPU programming.
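The auto-parallelization bullet can be made concrete with a toy sketch. This is plain Python, not MXNet code: a minimal dependency engine that runs ops with disjoint inputs concurrently on a thread pool, while ops that read another op's output wait for it, which is the general idea behind letting users write serial programs that execute in parallel.

```python
# Toy illustration (NOT MXNet's engine) of dependency-based auto
# parallelization: independent ops may run concurrently, dependent
# ops wait for the futures that produce their inputs.
from concurrent.futures import ThreadPoolExecutor

class ToyEngine:
    """Schedules each op as soon as the arrays it reads are ready."""
    def __init__(self):
        self.pool = ThreadPoolExecutor(max_workers=4)
        self.ready = {}  # array name -> Future that produces it

    def push(self, fn, reads, writes):
        deps = [self.ready[r] for r in reads if r in self.ready]
        def run():
            for d in deps:          # block until every input is produced
                d.result()
            return fn()
        fut = self.pool.submit(run)
        for w in writes:
            self.ready[w] = fut

    def wait_all(self):
        for fut in self.ready.values():
            fut.result()
        self.pool.shutdown()

engine = ToyEngine()
results = {}
# "a = 1" and "b = 2" touch disjoint data, so they may run in parallel;
# "c = a + b" reads both and is serialized after them.
engine.push(lambda: results.update(a=1), reads=[], writes=['a'])
engine.push(lambda: results.update(b=2), reads=[], writes=['b'])
engine.push(lambda: results.update(c=results['a'] + results['b']),
            reads=['a', 'b'], writes=['c'])
engine.wait_all()
print(results['c'])  # 3
```

The real engine additionally handles write-after-read and write-after-write hazards; this sketch only shows the read-after-write case.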

Bug Reporting
-------------
2 changes: 1 addition & 1 deletion dmlc-core
8 changes: 4 additions & 4 deletions doc/developer-guide/index.md
@@ -6,7 +6,7 @@ Overview of the Design
----------------------
* [Execution Engine](engine.md)

List of Resources
-----------------
* [Doxygen Version of C++ API](https://mxnet.readthedocs.org/en/latest/doxygen)
* [Contributor Guide](../contribute.md)
List of Other Resources
-----------------------
* [Doxygen Version of C++ API](https://mxnet.readthedocs.org/en/latest/doxygen) provides comprehensive documentation of the C++ API.
* [Contributor Guide](../contribute.md) explains how to contribute changes to the project.
17 changes: 17 additions & 0 deletions example/README.md
@@ -0,0 +1,17 @@
MXNet Examples
==============
This folder contains examples of MXNet.

Contents
--------
* [mnist](mnist) contains examples of training on the MNIST dataset.
* [cifar10](cifar10) contains examples of training on the CIFAR-10 dataset.


Python Howto
------------
[Python Howto](python-howto) is a folder of short code snippets,
each introducing a particular feature of MXNet.

***List of Examples***
* [Configuring Net to get Multiple Outputs](python-howto/multiple_outputs.py)
21 changes: 21 additions & 0 deletions example/python-howto/multiple_outputs.py
@@ -0,0 +1,21 @@
"""Create a Multiple output configuration.

This example shows how to create a multiple output configuration.
"""
import mxnet as mx

net = mx.symbol.Variable('data')
fc1 = mx.symbol.FullyConnected(data=net, name='fc1', num_hidden=128)
net = mx.symbol.Activation(data=fc1, name='relu1', act_type="relu")
net = mx.symbol.FullyConnected(data=net, name='fc2', num_hidden=64)
out = mx.symbol.Softmax(data=net, name='softmax')
# group fc1 and out together
group = mx.symbol.Group([fc1, out])
print(group.list_outputs())

# You can go ahead and bind on the group
# executor = group.simple_bind(ctx=mx.cpu(), data=data_shape)
# executor.forward()
# executor.outputs[0] will be the value of fc1
# executor.outputs[1] will be the value of softmax
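The grouping behavior above can be modeled with a toy class (purely illustrative, not MXNet code): a grouped symbol's output list is simply the concatenation of its members' output lists, in order, which is why `fc1`'s output comes before `softmax`'s in `list_outputs()`.

```python
# Toy model (NOT MXNet) of symbol grouping: a group's outputs are the
# concatenation of its members' outputs, in member order.
class ToySymbol:
    def __init__(self, outputs):
        self._outputs = list(outputs)

    def list_outputs(self):
        return list(self._outputs)

def toy_group(symbols):
    """Concatenate the output names of several toy symbols."""
    out = []
    for sym in symbols:
        out.extend(sym.list_outputs())
    return ToySymbol(out)

# Hypothetical output names, mimicking MXNet's "<name>_output" convention.
fc1 = ToySymbol(['fc1_output'])
softmax = ToySymbol(['softmax_output'])
group = toy_group([fc1, softmax])
print(group.list_outputs())  # ['fc1_output', 'softmax_output']
```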

53 changes: 31 additions & 22 deletions python/mxnet/symbol.py
Original file line number Diff line number Diff line change
@@ -1,6 +1,10 @@
# coding: utf-8
# pylint: disable=invalid-name, protected-access, fixme, too-many-arguments
"""Symbol support of mxnet"""
"""Symbolic support of mxnet.

Symbolic API of MXNet

"""
from __future__ import absolute_import

import ctypes
@@ -341,10 +345,10 @@ def _get_ndarray_handle(arg_key, args, arg_names, allow_missing):
arg_key : str
The name of argument, used for error message.

args : list of NDArray or dict of str->NDArray
args : list of NDArray or dict of str to NDArray
Input arguments to the symbols.
If type is list of NDArray, the position is in the same order of arg_names.
If type is dict of str->NDArray, then it maps the name of arguments
If type is dict of str to NDArray, then it maps the name of arguments
to the corresponding NDArray.

args_names : list of string
@@ -366,22 +370,22 @@ def _get_ndarray_handle(arg_key, args, arg_names, allow_missing):
raise ValueError('Length of %s does not match number of arguments' % arg_key)
for narr in args:
if not isinstance(narr, NDArray):
raise TypeError('Only Accept list of NDArrays or dict of str->NDArray')
raise TypeError('Only accept list of NDArrays or dict of str to NDArray')
arg_handles.append(narr.handle)
elif isinstance(args, dict):
for name in arg_names:
if name in args:
narr = args[name]
if not isinstance(narr, NDArray):
raise TypeError('Only Accept list of NDArrays or dict of str->NDArray')
raise TypeError('Only accept list of NDArrays or dict of str to NDArray')
arg_handles.append(narr.handle)
else:
if allow_missing:
arg_handles.append(None)
else:
raise ValueError('Must specify all the arguments in %s' % arg_key)
else:
raise TypeError('Only Accept list of NDArrays or dict of str->NDArray')
raise TypeError('Only accept list of NDArrays or dict of str to NDArray')
return c_array(NDArrayHandle, arg_handles)
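The list-versus-dict control flow above can be distilled into a plain-Python sketch, with NDArray handles replaced by the values themselves (illustrative only; the real function appends ctypes handles):

```python
# Simplified sketch of the normalization done by _get_ndarray_handle:
# a list must match arg_names positionally; a dict is looked up by name,
# with None filled in for missing entries when allow_missing is True.
def normalize_args(arg_key, args, arg_names, allow_missing):
    if isinstance(args, list):
        if len(args) != len(arg_names):
            raise ValueError('Length of %s does not match number of arguments' % arg_key)
        return list(args)
    if isinstance(args, dict):
        handles = []
        for name in arg_names:
            if name in args:
                handles.append(args[name])
            elif allow_missing:
                handles.append(None)
            else:
                raise ValueError('Must specify all the arguments in %s' % arg_key)
        return handles
    raise TypeError('Only accept list or dict of str to value')

print(normalize_args('args', {'x': 1, 'z': 3}, ['x', 'y', 'z'], allow_missing=True))
# [1, None, 3]
```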

def simple_bind(self, ctx, grad_req='write', **kwargs):
@@ -396,12 +400,12 @@ def simple_bind(self, ctx, grad_req='write', **kwargs):
ctx : Context
The device context the generated executor to run on.
grad_req: string
{'write', 'add', 'null'}, or list of str or dict of str->str, optional
{'write', 'add', 'null'}, or list of str or dict of str to str, optional
Specifies how we should update the gradient to the args_grad.
- 'write' means the gradient is written to the specified args_grad NDArray each time.
- 'add' means the gradient is added to the specified NDArray each time.
- 'null' means no action is taken; the gradient may not be calculated.
kwargs : dict of str->NDArray
kwargs : dict of str to NDArray

Returns
-------
@@ -436,34 +440,38 @@ def bind(self, ctx, args, args_grad=None, grad_req='write', aux_states=None):
ctx : Context
The device context the generated executor to run on.

args : list of NDArray or dict of str->NDArray
args : list of NDArray or dict of str to NDArray
Input arguments to the symbol.

- If type is list of NDArray, the position is in the same order of list_arguments.
- If type is dict of str->NDArray, then it maps the name of arguments
to the corresponding NDArray,
- If type is dict of str to NDArray, then it maps the name of arguments
to the corresponding NDArray.
- In either case, all the arguments must be provided.

args_grad : list of NDArray or dict of str->NDArray, optional
args_grad : list of NDArray or dict of str to NDArray, optional
When specified, args_grad provide NDArrays to hold
the result of gradient value in backward.

- If type is list of NDArray, the position is in the same order of list_arguments.
- If type is dict of str->NDArray, then it maps the name of arguments
- If type is dict of str to NDArray, then it maps the name of arguments
to the corresponding NDArray.
- When the type is dict of str->NDArray, users only need to provide the dict
- When the type is dict of str to NDArray, users only need to provide the dict
for needed argument gradient.
Only the specified argument gradient will be calculated.

grad_req : {'write', 'add', 'null'}, or list of str or dict of str->str, optional
grad_req : {'write', 'add', 'null'}, or list of str or dict of str to str, optional
Specifies how we should update the gradient to the args_grad.

- 'write' means the gradient is written to the specified args_grad NDArray each time.
- 'add' means the gradient is added to the specified NDArray each time.
- 'null' means no action is taken; the gradient may not be calculated.

aux_states : list of NDArray, or dict of str->NDArray, optional
aux_states : list of NDArray, or dict of str to NDArray, optional
Input auxiliary states to the symbol, only need to specify when
list_auxiliary_states is not empty.

- If type is list of NDArray, the position is in the same order of list_auxiliary_states
- If type is dict of str->NDArray, then it maps the name of auxiliary_states
- If type is dict of str to NDArray, then it maps the name of auxiliary_states
to the corresponding NDArray.
- In either case, all the auxiliary_states need to be provided.

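The three grad_req modes can be sketched as a tiny accumulator. This is a conceptual model, not MXNet internals: a dict stands in for the args_grad NDArrays.

```python
# Conceptual model (NOT MXNet internals) of the three grad_req modes.
def apply_grad(grad_req, args_grad, name, new_grad):
    if grad_req == 'write':
        args_grad[name] = new_grad                           # overwrite
    elif grad_req == 'add':
        args_grad[name] = args_grad.get(name, 0) + new_grad  # accumulate
    elif grad_req == 'null':
        pass                                                 # discard
    else:
        raise ValueError('unknown grad_req: %s' % grad_req)

g = {}
apply_grad('write', g, 'w', 1.0)
apply_grad('write', g, 'w', 2.0)   # 'write' keeps only the latest: 2.0
apply_grad('add', g, 'b', 1.0)
apply_grad('add', g, 'b', 2.0)     # 'add' accumulates: 3.0
print(g)  # {'w': 2.0, 'b': 3.0}
```

'add' is useful when a parameter receives gradients from several backward passes that should be summed before the update.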
@@ -579,7 +587,7 @@ def Variable(name):


def Group(symbols):
"""Create a symbolic variable that groups several symbols together.
"""Create a symbol that groups symbols together.

Parameters
----------
Expand Down Expand Up @@ -613,10 +621,11 @@ def load(fname):
Parameters
----------
fname : str
The name of the file
- s3://my-bucket/path/my-s3-symbol
- hdfs://my-bucket/path/my-hdfs-symbol
- /path-to/my-local-symbol
The name of the file, examples:

- `s3://my-bucket/path/my-s3-symbol`
- `hdfs://my-bucket/path/my-hdfs-symbol`
- `/path-to/my-local-symbol`

Returns
-------
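The three path forms accepted by load can be told apart by their URI prefix. A minimal sketch (illustrative only, not MXNet's actual I/O layer) of that dispatch:

```python
# Illustrative scheme detection for the three path forms accepted by load.
def path_scheme(fname):
    """Return 's3', 'hdfs', or 'local' depending on the path prefix."""
    if fname.startswith('s3://'):
        return 's3'
    if fname.startswith('hdfs://'):
        return 'hdfs'
    return 'local'

print(path_scheme('s3://my-bucket/path/my-s3-symbol'))  # s3
print(path_scheme('/path-to/my-local-symbol'))          # local
```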