test_nn_relu_forward_1() and test_nn_relu_backward_1() both pass only non-negative input to the ReLU layer, so they never exercise how the layer behaves for negative inputs.
The inputs are generated as x = get_tensor(*shape), and get_tensor() samples data uniformly from the [0, 5) interval:
def get_tensor(*shape, entropy=1):
    np.random.seed(np.prod(shape) * len(shape) * entropy)
    return ndl.Tensor(np.random.randint(0, 100, size=shape) / 20, dtype="float32")
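One possible fix is to center the sampled range around zero so ReLU sees negative values as well. A minimal sketch below (the helper name `get_signed_tensor_data` is hypothetical, and the array would still be wrapped in `ndl.Tensor` in the actual test suite):

```python
import numpy as np

def get_signed_tensor_data(*shape, entropy=1):
    """Sample values from [-2.5, 2.45] so ReLU is tested on negative inputs too.

    Same seeding scheme as the original get_tensor(); only the range is shifted.
    In the test suite this array would be wrapped as ndl.Tensor(..., dtype="float32").
    """
    np.random.seed(np.prod(shape) * len(shape) * entropy)
    return (np.random.randint(0, 100, size=shape) - 50) / 20

data = get_signed_tensor_data(8, 8)
print(data.min(), data.max())  # the range now straddles zero
```

With this change, a forward test can additionally assert that ReLU maps the negative entries to zero, which the current all-non-negative inputs cannot verify.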