Introduction: Signals, Noise, Energy and Linearity
by Erol Seke, for the course "Communications", OSMANGAZI UNIVERSITY
The Goal
Transfer information from a source point to a destination point correctly (and, in most cases, using the least amount of resources).
[Figure: Information Generator (source point) -> Channel -> Information User (destination point), with noise sources acting on the channel.]
An Example Communication System
[Figure: a sound amplifier system, input x amplified to output y.]
Some other examples of (electronic) source, channel and destination:
Microphone – Twisted pair of wires – Amplifier
Modem – Twisted pair of telephone line – Modem
Fax scanner – Telephone system – Fax printer
Computer – Ethernet cable – Computer
Computer's data storage medium – Fiber optic network – Another computer's storage
Digital data generator – Magnetic disk – Digital data user
Radio transmitter – Air – Radio receiver
Digital TV data from satellite – Atmosphere – Digital TV receiver
TV remote controller – Air – IR sensor/receiver on TV
An Example Digital Communication System
There are only 4 messages/values to be sent (the analog system had an infinite number of possible values).
Analog / Digital Electronic Communication
[Figure: Transmitter -> Channel -> Receiver, built from analog or digital circuits, with noise added to the transmitted signal in the channel.]
Analog communication: the transmitted signal can take an infinite number of possible values; every value of the signal matters at every point in time, and the signal cannot be completely repaired when damaged.
Digital communication: a finite number of symbols, each represented by one of a finite number of waveforms within a symbol duration $T_s$.
Digital Communication (against analog communication)
Advantages:
- Mathematical/logical processing of the data is possible
- Therefore: higher protection against noise
- More flexible when implemented with reconfigurable/reprogrammable elements
- ?
Disadvantages:
- Higher complexity
- Higher-speed devices are required
- Analog signals need to be converted back and forth using ADC/DAC
- ?
An Advantage of Digital Communication
The signal can be restored and resent halfway between transmitter and receiver.
[Figure: the transmitted signal picks up noise $N_1(t)$ on the first hop, a repeater regenerates $r(t)$, and the second hop adds noise $N_2(t)$.]
An Advantage of Digital Communication
...so that the signal is received with minimum (or no) error.
General Communication System
Various Signals
[Figure: example waveforms, including a triangular pulse, a rectangular periodic signal (square wave), a random signal (noise), and a discrete semi-periodic signal (BPSK samples).]
Energy of a Signal
The energy of a signal $x(t)$ is defined as the energy spent on a 1 Ohm load:
$E_x = \int_{-\infty}^{\infty} x^2(t)\,dt$
Its unit is Volt$^2\cdot$s/Ohm = Joules.
Example
Energy of a rectangular pulse: $x(t) = A$ for $t_0 \le t \le t_0 + T$, and $0$ otherwise.
$E_x = \int_{-\infty}^{\infty} x^2(t)\,dt = \int_{t_0}^{t_0+T} A^2\,dt = A^2 T$
The energy is independent of the position of the pulse on the time axis (a time shift leaves the energy unchanged; this is valid for all signals).
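As a quick numerical sanity check of $E_x = A^2 T$, a minimal sketch; the values of A, T and t0 below are illustrative assumptions, not from the slides:

```python
# Minimal numerical check of E = A^2 * T for a rectangular pulse.
# A, T and t0 are assumed example values.
import numpy as np

A, T, t0 = 2.0, 0.5, 1.0              # amplitude, width, arbitrary start time
t = np.linspace(0.0, 3.0, 300_000)    # fine time grid covering the pulse
dt = t[1] - t[0]
x = np.where((t >= t0) & (t <= t0 + T), A, 0.0)

E = np.sum(x**2) * dt                 # Riemann sum for the energy integral
print(E)                              # ~2.0 = A^2 * T, for any t0
```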
Example
Let $x(t)$ be a rectangular pulse of width $T$ and height $A = 1/T$ located at $t_0$, so that its area is $A T = 1$. As $T$ shrinks, the pulse becomes narrower and taller while its area stays 1:
$\lim_{T \to 0} x(t) = \delta(t - t_0)$, the unit impulse.
Power of a Signal
If the energy is infinite, then we talk about the energy spent in unit time:
$P_x = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\,dt$
For periodic signals with period $T$:
$P_x = \frac{1}{T} \int_{0}^{T} x^2(t)\,dt$
Its unit is Watts.
Example
Find the power of the sawtooth signal $s(t) = A t / T$ for $0 \le t < T$, repeated with period $T$:
$P_s = \frac{1}{T} \int_0^T \left(\frac{A t}{T}\right)^2 dt = \frac{1}{T}\,\frac{A^2}{T^2}\,\frac{t^3}{3}\Big|_0^T = \frac{A^2}{3}$
The power is independent of the period / frequency.
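A small numerical check that the sawtooth power comes out as $A^2/3$ for any period; the amplitude and the two periods below are assumed example values:

```python
# Numerical check that the sawtooth power equals A^2/3 regardless of T.
# The amplitude and periods are assumed example values.
import numpy as np

def sawtooth_power(A, T, n=1_000_000):
    t = np.linspace(0.0, T, n, endpoint=False)
    s = A * t / T                     # one period of s(t) = A*t/T
    return np.mean(s**2)              # (1/T) * integral of s^2 over a period

print(sawtooth_power(3.0, 1e-3))      # ~3.0 = A^2/3
print(sawtooth_power(3.0, 2.0))       # same power with a different period
```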
Example
Find the power and energy of the waveform $y(t) = \cos(8\pi t)$.
Using $P = \frac{1}{T}\int_a^{a+T} y^2(t)\,dt$ over $T = 1/8$ (one period of $\cos^2$):
$P_y = 8 \int_{-1/16}^{1/16} \cos^2(8\pi t)\,dt = 8\left[\frac{t}{2} + \frac{\sin(16\pi t)}{32\pi}\right]_{-1/16}^{1/16} = \frac{1}{2}$ (verify!)
Since it is a power signal, it is not an energy signal: $E_y = \infty$.
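The $P_y = 1/2$ result is easy to confirm numerically; a sketch averaging $y^2$ over one period of $\cos^2$ (the grid size is an assumption):

```python
# Numerical confirmation that P = 1/2 for y(t) = cos(8*pi*t),
# averaging y^2 over one period of cos^2 (T = 1/8).
import numpy as np

t = np.linspace(-1/16, 1/16, 1_000_000, endpoint=False)
y = np.cos(8 * np.pi * t)
print(np.mean(y**2))                  # ~0.5; the energy integral diverges
```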
Average / Expected Values
Average value of a continuous signal: $m_x = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,dt$
Average value of discrete samples: $x_{avg} = \frac{1}{N} \sum_{i=1}^{N} x_i$
Expected value of a random process $X$: $E\{X\} = \int_{-\infty}^{\infty} x\,f_X(x)\,dx$, where $x(t)$ denotes its generated values (we will define random processes later).
The average and the expected value are equal when the observation duration (or the number of samples) is infinite.
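To illustrate the last statement, a short sketch showing the sample average approaching the expected value as the number of samples grows; the uniform distribution here is an assumed example, not from the slides:

```python
# Sample average approaches the expected value as N grows.
# The uniform(-1, 1) distribution (with E{X} = 0) is an assumed example.
import numpy as np

rng = np.random.default_rng(0)
for N in (10, 1_000, 100_000):
    x = rng.uniform(-1.0, 1.0, N)
    print(N, np.mean(x))              # tends to E{X} = 0 as N grows
```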
Example
Let $x(t) = \sin(2\pi t/T)$ and $y(t) = \cos(2\pi t/T)$.
$E_x = \int_{-\infty}^{\infty} x^2(t)\,dt = \infty$ and $E_y = \int_{-\infty}^{\infty} y^2(t)\,dt = \infty$: the energies are both infinite (periodic signals).
$P_x = \frac{1}{T}\int_0^T \sin^2(2\pi t/T)\,dt = \frac{1}{2}$ and $P_y = \frac{1}{T}\int_0^T \cos^2(2\pi t/T)\,dt = \frac{1}{2}$: the powers are the same.
$m_x = m_y = \lim_{T' \to \infty}\frac{1}{T'}\int_{-T'/2}^{T'/2} \sin(2\pi t/T)\,dt = 0$: the averages are the same.
Question: what is different? They are both sinusoids.
Similarity / Dissimilarity Measure
The similarity of signals is measured using an inner product:
$\langle x(t), y(t) \rangle = \int_{-\infty}^{\infty} x(t)\,y(t)\,dt$
It is obvious that the integral, except for finite-duration signals, will be infinite. Therefore, we need some kind of normalization (or a finite integration interval).
$\langle x(t), y(t) \rangle = \int_0^T \sin(2\pi t/T)\cos(2\pi t/T)\,dt = 0$!
Does this mean these two are dissimilar? We need to check the similarity of shifted versions of the signals too.
Cross-Correlation
The similarity of shifted versions of signals,
$R_{xy}(\tau) = \int_{-\infty}^{\infty} x(t+\tau)\,y(t)\,dt$
is called the cross-correlation, where $\tau$ represents the time shift.
For periodic signals: $R_{xy}(\tau) = \int_0^T x(t+\tau)\,y(t)\,dt$
Normalized: $\bar{R}_{xy}(\tau) = R_{xy}(\tau) / R_{max}$
[Figure: the integration interval slides over the signals, shown for $\tau = 0$ and $\tau = T$.]
Example
Since our signals are periodic, we can select an integration interval of one period $T$:
$R_{xy}(\tau) = \int_0^T \sin(2\pi (t+\tau)/T)\cos(2\pi t/T)\,dt = \frac{T}{2}\sin(2\pi\tau/T)$
[Figure: $R_{xy}(\tau)$ is itself a sinusoid in $\tau$.]
The signals are similar to each other at periodic shift intervals.
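A discrete sketch of this periodic cross-correlation; the grid size is an assumption, and the printed values should match $R(0) = 0$ and $R(T/4) = T/2$:

```python
# Discrete sketch of the periodic cross-correlation of
# x(t) = sin(2*pi*t/T) and y(t) = cos(2*pi*t/T); it traces
# (T/2)*sin(2*pi*tau/T). The grid size is an assumption.
import numpy as np

T, n = 1.0, 4096
t = np.linspace(0.0, T, n, endpoint=False)
dt = T / n
x = np.sin(2 * np.pi * t / T)
y = np.cos(2 * np.pi * t / T)

# R_xy(tau) = integral over one period of x(t + tau) * y(t) dt
R = np.array([np.sum(np.roll(x, -k) * y) * dt for k in range(n)])
print(R[0])                           # ~0   : unshifted, "dissimilar"
print(R[n // 4])                      # ~T/2 : maximal similarity at tau = T/4
```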
Autocorrelation
If both signals are the same ($y(t) = x(t)$), the similarity measure is named the autocorrelation:
$R_{xx}(\tau) = \int_{-\infty}^{\infty} x(t+\tau)\,x(t)\,dt$
Example: [Figure: $R_{xx}(\tau)$ of a periodic signal peaks at shifts of $0, \pm T, \pm 2T, \ldots$]
The signal $x$ is similar to itself at periodic intervals.
Example
Let $x(t) = r(t)$ where $r(t) = 1$ for $0 \le t \le T$ and $0$ otherwise.
$R_{xx}(0) = \int_0^T dt = T$; as the shift grows, the overlap shrinks linearly:
$R_{xx}(\tau) = T - |\tau|$ for $|\tau| \le T$, and $0$ otherwise.
[Figure: the autocorrelation is a triangle of height $T$ centered at $\tau = 0$, reaching zero at $\tau = \pm T$.]
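A quick sketch verifying the triangular shape numerically; T and the grid resolution are assumed values:

```python
# The autocorrelation of a unit rectangular pulse of width T is the
# triangle T - |tau| on |tau| <= T. T and the grid are assumed values.
import numpy as np

T, n = 1.0, 10_000
dt = T / n
r = np.ones(n)                        # samples of r(t) = 1 on [0, T)

R = np.correlate(r, r, mode="full") * dt   # linear autocorrelation
print(R[n - 1])                       # ~T   at tau = 0 (center of "full")
print(R[n - 1 + n // 2])              # ~T/2 at tau = T/2
```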
Example
Let $x(t) = r(t)$ where $r(t) = t$ for $0 \le t \le T$ and $0$ otherwise.
$R_{xx}(0) = \int_0^T t^2\,dt = \frac{T^3}{3}$
For $0 \le \tau \le T$: $R_{xx}(\tau) = \int_0^{T-\tau} (t+\tau)\,t\,dt$ (and symmetrically for negative shifts).
[Figure: the autocorrelation peaks at $T^3/3$ for $\tau = 0$ and vanishes for $|\tau| \ge T$.]
Orthogonal Signals
The inner product also tells us whether signals are orthogonal:
if $\langle x(t), y(t) \rangle = \int x(t)\,y(t)\,dt = 0$, then $x(t)$ and $y(t)$ are orthogonal.
$\langle x(t), y(t) \rangle = \int_0^T \sin(2\pi t/T)\cos(2\pi t/T)\,dt = 0$
meaning that within $(0, T)$, $y(t)$ does not contain any component of $x(t)$.
(Shifted versions of the signals may not be orthogonal.)
Example
$x(t) = u(t) - u(t - T/2)$ and $y(t) = u(t - T/2) - u(t - T)$:
$\langle x(t), y(t) \rangle = \int_0^T \big(u(t) - u(t - T/2)\big)\big(u(t - T/2) - u(t - T)\big)\,dt = 0$
Given a set of waveforms $x_i(t)$, we can find an orthogonal waveform set $\phi_k(t)$ so that each $x_i(t)$ can be written as a weighted linear sum:
$x_i(t) = \sum_{k=0}^{M-1} c_{i,k}\,\phi_k(t)$
Homework: study this subject (orthogonalization) from the referenced sources; a sketch follows below.
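A minimal Gram-Schmidt sketch on sampled waveforms, one common way to perform the orthogonalization mentioned above; the waveforms 1, t, t^2 on [0, 1) are assumed examples, not from the slides:

```python
# Gram-Schmidt on sampled waveforms: build an orthogonal set phi_k
# from given waveforms x_i. The waveforms used are assumed examples.
import numpy as np

def gram_schmidt(X, dt):
    """Rows of X are sampled waveforms; returns orthogonal rows."""
    phis = []
    for x in X:
        v = x.astype(float).copy()
        for phi in phis:
            v -= (np.sum(v * phi) / np.sum(phi * phi)) * phi  # remove projection
        if np.max(np.abs(v)) > 1e-12:                         # skip dependent rows
            phis.append(v)
    return np.array(phis)

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
dt = t[1] - t[0]
X = np.vstack([np.ones_like(t), t, t**2])
phi = gram_schmidt(X, dt)
print(np.sum(phi[0] * phi[1]) * dt)   # ~0: the set is orthogonal
print(np.sum(phi[0] * phi[2]) * dt)   # ~0
```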
Probability
Die-throwing experiment: $p_i = 1/6$ for $x \in \{1, 2, 3, 4, 5, 6\}$, and $\sum_{i=1}^{6} p_i = 1$.
Random variable: an event or value which is measured.
$x$: the value read on the die after a throw (the random variable)
$p_i$: the probability of $x$, i.e. $p(x = i)$
Expected Value of a Discrete Experiment
$E(X) = \sum_i x_i\,p(x_i)$
The name "expected value" does not imply that this value is expected to happen: the expected value of the die-throwing experiment is 3.5, which will never occur on the die.
[Figure: the graph of $p_i = 1/6$ versus $x = 1 \ldots 6$ is called the Probability Mass Function (pmf).]
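The 3.5 figure follows directly from the definition; a one-line check:

```python
# E{X} = sum_i x_i * p_i for a fair die: 3.5, a value the die never shows.
values = [1, 2, 3, 4, 5, 6]
p = [1 / 6] * 6
print(sum(x * q for x, q in zip(values, p)))   # 3.5
```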