^ It depends on where in the sky you're shooting. At the equator, if you aim toward one of the celestial poles (low on the north or south horizon), the stars are moving MUCH, MUCH slower across the frame than if you shoot straight up or toward the east or west horizon. And as your position moves farther north or south, this changes accordingly. Near the North Pole, the stars straight overhead move slowest, and the stars near any horizon move fastest.
We're talking easily 1, maybe 2, orders of magnitude here. So depending on which part of the sky you shoot, 10 seconds might be enough to show a trail in one spot, whereas somewhere else 2-3 minutes might be needed to show a trail.
In fact, you can calculate it. A star's apparent motion depends only on its declination: the sky turns through 360° in about 23 h 56 min, so a star drifts at roughly 15 arcseconds per second of time, multiplied by the cosine of its declination (stars near the celestial pole crawl, stars on the celestial equator move at the full rate). Convert that with your plate scale, i.e. how many arcseconds each pixel covers for your focal length and sensor, and you have the star's speed in pixels per second. Then you can determine whether the blur would be visible or not.
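To make that concrete, here's a minimal sketch of the arithmetic in Python. The focal length, pixel pitch, declination, and exposure time below are just placeholder numbers; plug in your own lens, sensor, and target.

```python
import math

def plate_scale_arcsec_per_px(focal_length_mm, pixel_pitch_um):
    # 206265 arcsec per radian; pixel pitch converted from microns to mm
    return 206.265 * pixel_pitch_um / focal_length_mm

def drift_px_per_sec(declination_deg, focal_length_mm, pixel_pitch_um):
    # Sidereal rate: the sky turns ~15.04 arcsec per second of time,
    # scaled by cos(declination) -- near the celestial pole stars crawl,
    # on the celestial equator they move at the full rate.
    rate_arcsec = 15.04 * math.cos(math.radians(declination_deg))
    return rate_arcsec / plate_scale_arcsec_per_px(focal_length_mm, pixel_pitch_um)

# Placeholder example: 24 mm lens, 4.3 um pixels, star at declination +20 deg
speed = drift_px_per_sec(20, 24, 4.3)
print(f"{speed:.2f} px/s -> {speed * 30:.1f} px trail in a 30 s exposure")
```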
Less than about 1-2 pixels of drift over the course of your exposure = no noticeable trails.
More than 1-2 pixels = noticeable trails.
For a very wide-angle shot, calculate several stars at the edges and center of the frame and take the fastest-moving one as the limiting factor (see the sketch below).
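Reusing drift_px_per_sec from the sketch above, a quick way to do that check (the declinations listed are placeholders you'd read off a star chart or planetarium app):

```python
# Sample a few stars near the corners and center of a wide frame and let
# the fastest one decide whether the planned exposure is safe.
exposure_s = 20                       # planned exposure in seconds (placeholder)
declinations_deg = [-5, 10, 35, 60]   # placeholder declinations from a chart
trails_px = {d: drift_px_per_sec(d, 24, 4.3) * exposure_s for d in declinations_deg}
print(trails_px)
print("noticeable trails" if max(trails_px.values()) > 2 else "no noticeable trails")
```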
Annoying how astronomy matters for astrophotography, isn't it? But anyway, you could both be right.