Time meaningfully progresses every 1/60th of a second, when a new frame is sent to the monitor. If your video card is fast enough to render game X at 250fps, it will render and DISCARD about three frames before the next monitor refresh, meaning that regardless of buffering, the next displayed frame WILL contain your input. If your video card is vsynced, it would instead have rendered three "future" frames, each of which WILL be displayed, 1/60th of a second apart, so with N frames queued ahead your input lags by N/60th of a second.
vsync off (250fps, frames 1/250th of a second apart):
time=0 : Frame 1 is displayed.
time=1/250s: Frame 2 created.
time=1.5/250s: input from user
time=2/250s: Frame 3 created.
time=3/250s: Frame 4 created.
time=4/250s: Frame 5 created.
time=1/60s: Frame 5 (the newest complete frame, which contains the input) begins sending to monitor.
time=5/250s: Frame 6 replaces frame 5 in the buffer mid-scanout, resulting in tearing: the top half of frame 5 and the bottom half of frame 6 are displayed.
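The vsync-off timeline above can be sketched as a small calculation (the numbers, 250fps rendering, a 60Hz refresh, and an input at t = 1.5/250s, are the ones used in the example; the frame numbering assumes frame n finishes at t = (n-1)/250s):

```python
# Vsync off: the refresh simply grabs the newest complete frame.
RENDER_DT = 1 / 250   # a new frame finishes every 1/250 s
REFRESH = 1 / 60      # the monitor starts a new scanout every 1/60 s
INPUT_T = 1.5 / 250   # moment the user presses a key

def newest_frame(t):
    """Number of the newest fully rendered frame at time t
    (frame 1 is on screen at t = 0, frame n finishes at (n-1)/250 s)."""
    return int(t // RENDER_DT) + 1

shown = newest_frame(REFRESH)                     # frame sent at the refresh
has_input = (shown - 1) * RENDER_DT >= INPUT_T    # finished after the input?
print(f"refresh at t={REFRESH:.4f}s sends frame {shown}")
print(f"input visible: {has_input}")
```

Running it confirms the timeline: the refresh at 1/60s sends frame 5, which was finished after the input arrived, so the input is on screen at the very next refresh.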
vsync on (250fps CAPABLE card working at 60fps):
time=0 : Frame 1 is displayed.
time=1/250s: Frame 2 created
time=1.5/250s: input from user
time=2/250s: Frame 3 created.
time=3/250s: Frame 4 created.
time=1/60s: Frame 2 (which is missing the last input) begins sending to monitor; when it finishes, frame 5 will begin rendering.
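The vsync-on timeline works out differently because the pre-rendered frames form a queue, and each refresh displays the OLDEST queued frame rather than the newest. A rough sketch, assuming a queue depth of 3 to match frames 2-4 in the example:

```python
from collections import deque

RENDER_DT = 1 / 250   # the card can finish a frame every 1/250 s
INPUT_T = 1.5 / 250   # moment the user presses a key
QUEUE_DEPTH = 3       # assumed number of pre-rendered frames queued ahead

queue = deque()       # frames waiting to be displayed, oldest first
t, frame = 0.0, 2     # frame 1 is on screen at t = 0
while len(queue) < QUEUE_DEPTH:
    t += RENDER_DT
    queue.append((frame, t))   # (frame number, time it finished rendering)
    frame += 1

shown, rendered_at = queue.popleft()   # what the next refresh displays
print(f"refresh sends frame {shown}, rendered at t={rendered_at:.4f}s")
print(f"frame predates the input: {rendered_at < INPUT_T}")
```

The displayed frame is frame 2, rendered at 1/250s, before the input at 1.5/250s; the input has to wait until the queued frames drain, roughly one refresh interval per queued frame.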
Basically, in a very high FPS situation, input lag is introduced by triple and double buffering (2/60th and 1/60th of a second respectively), but tearing is eliminated. With low FPS the input lag is lessened, because it is less likely that frames are rendered ahead (the video card simply is not fast enough to get ahead), but it can still occur during FPS spikes. Tearing, however, remains completely gone.
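The lag figures in that summary follow directly from the queue picture: with vsync on and the card rendering far ahead, each extra buffered frame adds one refresh interval before your input reaches the screen. A quick back-of-the-envelope check:

```python
# Each frame already queued ahead of yours costs one 1/60 s refresh interval.
REFRESH = 1 / 60

def added_lag(frames_queued_ahead):
    """Worst-case extra input lag from frames already waiting in the queue."""
    return frames_queued_ahead * REFRESH

# Double buffering queues 1 frame ahead, triple buffering queues 2.
for name, depth in [("double buffering", 1), ("triple buffering", 2)]:
    print(f"{name}: up to {added_lag(depth) * 1000:.1f} ms extra input lag")
```

This reproduces the 1/60th (about 16.7 ms) and 2/60th (about 33.3 ms) of a second figures quoted above.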