This is the basic building block of tensor-based computation. C/C++ use row-major ordering for arrays, while Julia follows column-major ordering. To keep things consistent, we keep the underlying data in its original layout, but use the language-native convention when we talk about shapes. For example, a mini-batch of 100 MNIST images is a tensor of C/C++/Python shape (100, 1, 28, 28), while in Julia the same piece of memory has shape (28, 28, 1, 100).

In case you want to use a particular editor to edit the root crontab, run it like this: `sudo EDITOR=editor crontab -e`, e.g. for Vim: `sudo EDITOR=vim crontab -e`, or for Nano: `sudo EDITOR=nano crontab -e`. Don't add `sudo` before a command or script inside the crontab, because it already runs as root once it is in the root crontab.

The elementwise trigonometric functions apply to an `NDArray` with Julia's dotted broadcasting syntax:

- `sin.(x::NDArray)`: defined in src/operator/tensor/elemwise_unary_op_:L46
- `cos.(x::NDArray)`: defined in src/operator/tensor/elemwise_unary_op_:L63
- `tan.(x::NDArray)`: defined in src/operator/tensor/elemwise_unary_op_:L83
- `sinh.(x::NDArray)`: defined in src/operator/tensor/elemwise_unary_op_:L201
- `cosh.(x::NDArray)`: defined in src/operator/tensor/elemwise_unary_op_:L216
- `tanh.(x::NDArray)`: defined in src/operator/tensor/elemwise_unary_op_:L234

`reshape(arr::NDArray, dim; reverse=false)` reshapes an array; defined in src/operator/tensor/matrix_op.cc:L165.

`broadcast_axis(x::NDArray, dim, size)` broadcasts the input array over particular axis(axes). The parameters `dim` and `size` can each be a scalar, a Tuple, or an Array. `broadcast_axes` is just an alias. Defined in src/operator/tensor/broadcast_reduce_op_:L207.

`broadcast_to(x::NDArray, dims)` broadcasts the input array to a new shape, e.g. starting from `x = mx.ones(2, 3, 4)`. Defined in src/operator/tensor/broadcast_reduce_op_:L231.

In case broadcasting doesn't work out of the box, you can expand the NDArray first.
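The crontab advice boils down to a one-off `EDITOR` override plus sudo-free entries. An illustrative root crontab fragment (the backup script path is hypothetical, for demonstration only):

```
# m  h  dom mon dow  command      <- entries run as root, so no sudo prefix
  0  3   *   *   *   /usr/local/bin/backup.sh
```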
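The shape-reversal convention above can be made concrete: a row-major (C) array of one shape and a column-major (Fortran/Julia) array of the reversed shape address the same memory when the multi-index is reversed too. The following pure-Python sketch is illustrative only (the helper names are not part of MXNet.jl):

```python
def strides(shape, order):
    """Element strides of a contiguous array. order='C' is row-major
    (last axis varies fastest); order='F' is column-major (first axis
    varies fastest, as in Julia)."""
    n = len(shape)
    s = [1] * n
    if order == 'C':
        for i in range(n - 2, -1, -1):
            s[i] = s[i + 1] * shape[i + 1]
    else:  # 'F'
        for i in range(1, n):
            s[i] = s[i - 1] * shape[i - 1]
    return s

def offset(index, shape, order):
    """Linear offset of a multi-index into a contiguous buffer."""
    return sum(i * st for i, st in zip(index, strides(shape, order)))

c_shape = (100, 1, 28, 28)           # C/C++/Python view of an MNIST batch
jl_shape = tuple(reversed(c_shape))  # (28, 28, 1, 100), the Julia view

idx = (5, 0, 3, 7)                   # an element in the C view
jl_idx = tuple(reversed(idx))        # the same element in the Julia view

# Both conventions hit the same position in the underlying buffer.
assert offset(idx, c_shape, 'C') == offset(jl_idx, jl_shape, 'F')
```

This is why MXNet.jl can leave the bytes untouched and merely report shapes in reversed, language-native order.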
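Julia's dotted call `sin.(x)` applies the scalar function to every element of the array. A minimal pure-Python analogue of that elementwise semantics (the `elementwise` helper is illustrative, not an MXNet.jl function):

```python
import math

def elementwise(f, xs):
    """Apply f to every element of xs, like Julia's dotted call f.(x)."""
    return [f(v) for v in xs]

xs = [0.0, math.pi / 2]
print(elementwise(math.sin, xs))  # sin applied element by element
```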
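`broadcast_axis` and `broadcast_to` replicate size-1 axes out to a target shape. A minimal pure-Python sketch of that idea for the 2-D case, using lists of lists (the function name and representation are illustrative, not the MXNet.jl implementation):

```python
def broadcast_to_2d(mat, rows, cols):
    """Expand a 2-D list-of-lists whose axes may have size 1 to
    shape (rows, cols), mimicking NDArray broadcasting."""
    r, c = len(mat), len(mat[0])
    if r not in (1, rows) or c not in (1, cols):
        raise ValueError("can only broadcast axes of size 1")
    # Replicate along the column axis first, then the row axis.
    widened = [row * cols if c == 1 else row for row in mat]
    return widened * rows if r == 1 else widened

row_vec = [[1.0, 2.0, 3.0]]             # shape (1, 3)
print(broadcast_to_2d(row_vec, 2, 3))   # replicated down to shape (2, 3)
```

As in the NDArray API, only axes of size 1 may be expanded; if a shape does not line up, you would expand (reshape) the array first, as the text above suggests.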