\$\begingroup\$

Two independent bits oscillate at the same frequency (50% duty cycle) and are connected to an AND gate.

Initially, the first bit holds a constant 0, and the second bit holds a constant 1. At time t=0, both bits are guaranteed to begin oscillating simultaneously, toggling their states (0 ↔ 1) every half-period.

The oscillators of both inputs are independent but share identical frequency, start time, and duty cycle.

Under ideal assumptions (no phase drift, jitter, or frequency mismatch; instantaneous state transitions), is the AND gate output guaranteed to remain 0 indefinitely despite using independent oscillators?

Specifically, does the initial phase difference (0 vs. 1) ensure the two bits never overlap at 1 due to their 180° phase shift, or could independent oscillators still desynchronize over time despite identical frequency and simultaneous start?
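Under the ideal assumptions, the setup can be reduced to a toy model (illustrative Python; the function names are mine, not part of any real design):

```python
# Ideal model: two bit streams toggling every half-period, started
# simultaneously at t = 0 in opposite states (bit1 = 0, bit2 = 1).
def bit1(t, period=1.0):
    # 0 during the first half-period, 1 during the second, and so on
    return int(t // (period / 2)) % 2

def bit2(t, period=1.0):
    # starts at 1: the exact complement of bit1 at every instant
    return 1 - int(t // (period / 2)) % 2

# With identical frequency, start time, and duty cycle, the AND of the
# two bits is 0 at every sampled instant.
assert all(bit1(k / 1000) & bit2(k / 1000) == 0 for k in range(100000))
print("AND output stayed 0 at every sampled instant")
```

Note that in this ideal model bit2 is, by construction, the exact complement of bit1, which is precisely what the shared-oscillator-plus-NOT arrangement enforces physically; "independence" only starts to matter once real-world imperfections are allowed back in.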

If it is guaranteed, what makes the answer under ideal assumptions different from using a single shared oscillator for both inputs, with one input inverted through a NOT gate driven from that shared oscillator (which, undisputedly, is always synchronized)?

Is it also guaranteed in reality? At minimum, name a physical technology for implementing the AND gate that could support the guaranteed condition.

\$\endgroup\$
  • \$\begingroup\$ One enemy (possibly the worst) is the travel time of your signal. You need to ensure all signal lines are the SAME length. This is very hard to achieve. \$\endgroup\$ Commented Apr 12 at 20:31
  • \$\begingroup\$ A sine wave has two (three) degrees of freedom: frequency and phase shift (and amplitude). If both are locked (identical) then they won't desynchronize. But in reality this won't be the case. Imagine a temperature gradient between the two oscillators: they change their frequencies accordingly (on the order of tens of ppm for quartz-based oscillators). After some time, desync is guaranteed. \$\endgroup\$ Commented Apr 12 at 20:35
  • \$\begingroup\$ "...then they won't desynchronize". Even if we are talking about the ideal assumption, what makes that ideal answer different from using a single shared oscillator for both inputs, where one input is inverted using a NOT gate? \$\endgroup\$ Commented Apr 12 at 20:41
  • \$\begingroup\$ It doesn't make sense to specify "independent oscillators" and then imagine away everything that makes them independent. In real life, two independent oscillators will never stay in phase indefinitely. They will always drift, and the question is only: how much, how soon? Two oscillators that magically stay in sync forever are not independent. \$\endgroup\$ Commented Apr 12 at 21:13
  • \$\begingroup\$ What's behind your question here? Is this just the fun of exploring a what-if with a load of this-I-like, this-I-don't-like rules behind it? Or is there an actual design situation you're facing that you need a solution for? \$\endgroup\$ Commented Apr 12 at 22:12

1 Answer

\$\begingroup\$

Two independent real-world oscillators are unlikely to have exactly the same frequency.

Even if it were a single oscillator with dual outputs, the phase shift and duty cycle would still be affected by manufacturing tolerances.

Even if you feed exactly the same signal to both inputs of an AND gate, the internal circuits and their tolerances will modify the phase and duty cycle, so the output does not look like the signal that went in.

Digital gates and chips are not ideal devices working with mathematical 0 and 1 states that toggle at infinite rate. Digital chips are constructed from analog circuitry such as transistors, which are non-ideal and take finite time to turn on and off; thus a signal passing through a gate gets delayed, and a square wave's duty cycle gets altered.
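To make the drift argument concrete, here is a toy simulation (illustrative Python; the 10 ppm figure and function names are my assumptions) in which the second oscillator runs just 10 ppm fast. The fraction of time both bits sit at 1 starts out negligible and grows steadily:

```python
import random

def bit(t, period, start_high):
    # square wave toggling every half-period; starts at 1 if start_high
    phase = int(t // (period / 2)) % 2
    return (1 - phase) if start_high else phase

p1 = 1.0               # first oscillator: 1 s period
p2 = 1.0 / (1 + 1e-5)  # second oscillator: 10 ppm fast (assumed drift)

def overlap_fraction(t0, n=20000, seed=0):
    # fraction of random instants in [t0, t0 + 100) where AND = 1
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        t = t0 + 100 * rng.random()
        hits += bit(t, p1, False) & bit(t, p2, True)
    return hits / n

print(overlap_fraction(0))      # essentially 0: still anti-phase
print(overlap_fraction(25000))  # ~0.25: quarter-period slip after ~7 hours
```

A 10 ppm mismatch means the two waveforms slip by one full period roughly every 100,000 cycles, so the "never both 1" property is lost almost immediately and the overlap keeps growing until the signals are fully in phase.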

\$\endgroup\$
