For $m$ around $n/\log n$ the probability goes to $0$; it does not matter whether the entries are drawn from $(0,1)$ or $(-1,1)$.

Combinatorial Lemma: Any family of subsets of $\{1,\dots, n\}$ which contains no pair of subsets $A, B$ with $A \subset B$ has at most $\binom{n}{\lfloor n/2 \rfloor}$ elements.

Proof: Consider all maximal chains from the empty set to the full set that add one element at a time; there are $n!$ such chains. Each chain can meet the family at most once, while each member of the family of size $k$ lies on exactly $k!\,(n-k)!$ chains. The quantity $k!\,(n-k)!$ is minimized at $k = \lfloor n/2 \rfloor$, so the family has at most $n!/\big(\lfloor n/2 \rfloor!\,\lceil n/2 \rceil!\big) = \binom{n}{\lfloor n/2 \rfloor}$ members.
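For small $n$ the lemma (Sperner's theorem) can be checked by brute force. A sketch in Python, with subsets encoded as bitmasks; all function names here are illustrative, not from any library:

```python
from itertools import combinations
from math import comb

def is_antichain(family):
    """True if no member of `family` (subsets as bitmasks) contains another."""
    return all((a & b) != a and (a & b) != b for a, b in combinations(family, 2))

def max_antichain_size(n):
    """Brute-force the largest antichain in the power set of {1, ..., n}."""
    subsets = list(range(2 ** n))
    best = 0
    for family_mask in range(2 ** len(subsets)):
        family = [s for i, s in enumerate(subsets) if family_mask >> i & 1]
        if len(family) > best and is_antichain(family):
            best = len(family)
    return best

for n in range(1, 4):
    assert max_antichain_size(n) == comb(n, n // 2)
```

The double-exponential enumeration limits this to $n \leq 4$ or so, but it confirms the bound is attained by the middle layer.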

Consequence: For a vector $v$ with $c$ nonzero entries, the probability that it lies in the kernel of a random $m \times n$ matrix is at most

$$ \left( \frac{ \binom{c}{\lfloor c/2 \rfloor}}{ 2^c} \right)^m \approx \left( \sqrt{\frac{2}{\pi c}} \right)^m \leq \left( \frac{1}{2} \right)^{m}$$

Proof: We may ignore the columns corresponding to the zero entries of $v$, so we may assume $n = c$. The rows are independent, so the probability is the $m$-th power of the probability that a single row is orthogonal to $v$. Identify rows with subsets of $\{1,\dots,c\}$: at a coordinate where $v$ is positive, let the larger possible entry ($1$) correspond to membership in the subset and the smaller ($0$ or $-1$) to non-membership, and at a coordinate where $v$ is negative, the reverse. Then the dot product of a row with $v$ is a strictly increasing function of the subset, so the rows with dot product zero correspond to a family with no two subsets comparable under inclusion, which by the combinatorial lemma has at most $\binom{c}{\lfloor c/2 \rfloor}$ members. Dividing by the $2^c$ possible rows gives the probability for one row, and raising to the power $m$ finishes the proof.
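For small $c$ the single-row bound can be verified by enumerating all $2^c$ rows directly. A sketch assuming $(0,1)$ entries (the $(-1,1)$ case is analogous); the function name is mine:

```python
from itertools import product
from math import comb

def orthogonal_rows(v):
    """Count rows x in {0,1}^c with x . v == 0, by direct enumeration."""
    return sum(1 for x in product((0, 1), repeat=len(v))
               if sum(xi * vi for xi, vi in zip(x, v)) == 0)

# Every v with all entries nonzero is orthogonal to at most comb(c, c//2) rows.
c = 4
for v in product((-2, -1, 1, 2), repeat=c):
    assert orthogonal_rows(v) <= comb(c, c // 2)
```

Vectors such as $v = (1,-1,1,-1)$ attain the bound exactly, matching the antichain argument.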

The asymptotic is well-known, though I may have the constant wrong, and the upper bound is obvious.

Now we split the short vectors into those with fewer than $C$ nonzero entries and those with at least $C$ nonzero entries.

There are at most

$$\binom{n}{C} \sqrt{n}^{\,C} \approx n^{ (3/2) C}$$

vectors with fewer than $C$ nonzero entries, and each has probability at most $2^{-m}$ of being in the kernel, so we need:

$$ n^{ (3/2) C} < 2^m $$
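The count of sparse candidates can be checked exactly for moderate parameters. The sketch below counts vectors with fewer than $C$ nonzero entries, each nonzero entry an integer of absolute value at most $\sqrt{n}$; dropping the length constraint only makes the count larger, so the comparison is still valid (the function name is mine):

```python
from math import comb, isqrt

def sparse_vector_count(n, C):
    """Vectors in Z^n with fewer than C nonzero entries,
    each nonzero entry in [-isqrt(n), isqrt(n)] \ {0}."""
    r = isqrt(n)  # each nonzero entry takes one of 2*r values
    return sum(comb(n, j) * (2 * r) ** j for j in range(C))

n, C = 100, 5
assert sparse_vector_count(n, C) < n ** (1.5 * C)
```

At $n = 100$, $C = 5$ the exact count is about $6 \times 10^{11}$ against a bound of $10^{15}$, so the approximation $n^{(3/2)C}$ is comfortable.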

There are at most $k^n$ vectors of length at most $\sqrt{n}$ for a constant $k$, and each of those with at least $C$ nonzero entries has probability at most $\left( \sqrt{2/(\pi C)} \right)^m$ of being in the kernel, so we need:

$$ k^n < C^{m/2} (\pi/2)^{m/2} $$

Taking $m$ a suitable constant multiple of $n/\log n$ and $C$ a small constant multiple of $n/(\log n)^2$ does the trick.
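As a numerical sanity check with hypothetical constants (I take $m = 6n/\log n$, $C = n/(\log n)^2$, and $k \approx 4.82$, the value suggested by the Gaussian-sum count of short vectors), both inequalities hold in logarithmic form at moderate scales:

```python
from math import log, pi

def conditions_hold(n, a=6.0, b=1.0, k=4.82):
    """Check n^{(3/2)C} < 2^m and k^n < C^{m/2} (pi/2)^{m/2} in log form,
    with m = a*n/log(n) and C = b*n/log(n)^2. Constants are illustrative."""
    m = a * n / log(n)
    C = b * n / log(n) ** 2
    cond1 = 1.5 * C * log(n) < m * log(2)
    cond2 = n * log(k) < (m / 2) * (log(C) + log(pi / 2))
    return cond1 and cond2

assert conditions_hold(10 ** 6)
```

The second condition forces the multiple of $n/\log n$ to exceed roughly $2\log k$, which is why the constant in $m$ must be large enough rather than arbitrarily small.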

The improvement over Igor Rivin's bound comes only from the better estimate for the number of vectors of length at most $\sqrt{n}$. The proof of this estimate is simple: consider the sum of $e^{-|v|^2}$ over $v \in \mathbb Z^n$. It equals $\left( \sum_{x \in \mathbb Z} e^{-x^2} \right)^n$, and each vector of length at most $\sqrt{n}$ contributes at least $e^{-n}$, so there are at most $k^n$ such vectors with $k = e \sum_{x \in \mathbb Z} e^{-x^2}$. The complicated combinatorial calculations only serve to make the argument rigorous.
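The Gaussian-sum bound can be checked exactly in small dimensions. A sketch (function names mine): the dynamic program counts integer vectors with $|v|^2 \leq n$ one coordinate at a time, and the bound is $e^n \left(\sum_x e^{-x^2}\right)^n$:

```python
from math import exp

def lattice_points_in_ball(n):
    """Count v in Z^n with |v|^2 <= n, adding one coordinate at a time."""
    counts = [0] * (n + 1)
    counts[0] = 1  # counts[s] = number of partial vectors with squared length s
    for _ in range(n):
        new = [0] * (n + 1)
        for s, c in enumerate(counts):
            if c:
                x = 0
                while s + x * x <= n:
                    new[s + x * x] += c * (1 if x == 0 else 2)  # x and -x
                    x += 1
        counts = new
    return sum(counts)

def gaussian_bound(n):
    """e^n * (sum_x e^{-x^2})^n: each v with |v|^2 <= n contributes >= e^{-n}."""
    theta = sum(exp(-x * x) for x in range(-10, 11))  # tails are negligible
    return (exp(1) * theta) ** n

for n in range(1, 9):
    assert lattice_points_in_ball(n) <= gaussian_bound(n)
```

Here $k = e \sum_x e^{-x^2} \approx 4.82$; in dimension $8$, for instance, the exact count is a few tens of thousands against a bound near $2.9 \times 10^5$.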