Magnetoencephalography
Preprocessing and Noise Reduction Techniques

Eliezer Kanal
2/20/2012
MEG Basics Course

1
About Me
• 2005 - 2009      University of Pittsburgh
                   PhD, Bioengineering

• 2009 - 2011      Carnegie Mellon University
                   Postdoctoral fellow, CNBC

• 2011 - current   PNC Financial Services
                   Quantitative Analyst, Risk Analytics

2
Dealing with Noisy Data
• Overview of MEG Noise
• Noise Reduction
  - Averaging, thresholding, frequency filters
  - SSP
  - SSS/tSSS
• Source Extraction
  - PCA
  - ICA

3
MEG Noise




            4
Breathing

5–6
Frequency

7–8
Time-Frequency




9
Biological Noise




                   Vigário, Jousmäki, Hämäläinen, Hari, & Oja (1997)


                                                                       10
Line Noise
50 Hz line noise (60 Hz in USA)

[panels: Subject; Empty Room]

11
Bad Channels

Find the bad one:

12
Noise from nearby construction




                                 13
Noise Reduction Techniques

• Averaging, thresholding, frequency filters
• SSP
• SSS/tSSS

14
Averaging
• Removes non-time-locked noise
• Requires:
  - Time-locked block paradigm design
  - Temporal or low-frequency analyses

15
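A minimal NumPy sketch of trial averaging, under the assumption that the data are already epoched into a (trials × channels × samples) array; all names, shapes, and values here are illustrative:

```python
import numpy as np

# Illustrative epoched data: (n_trials, n_channels, n_samples),
# every trial time-locked to the same stimulus onset.
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 100, 306, 500
evoked_true = np.sin(np.linspace(0, 4 * np.pi, n_samples))   # time-locked signal
epochs = evoked_true + rng.normal(scale=5.0,
                                  size=(n_trials, n_channels, n_samples))

# Averaging over trials keeps the time-locked signal while the
# non-time-locked noise shrinks roughly as 1/sqrt(n_trials).
evoked = epochs.mean(axis=0)    # (n_channels, n_samples)
```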
Thresholding
• Discarding trials/channels whose maximum signal
  intensity exceeds some user-defined value

• Removes most “data blips”

• Rudimentary; a better technique is simply to examine
  each trial/channel

16
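A sketch of threshold-based trial rejection, continuing the illustrative `epochs` array from the averaging example (the threshold value is arbitrary here; in real data it depends on sensor type and units):

```python
import numpy as np

# Peak absolute amplitude per trial, over all channels and samples.
peak_per_trial = np.abs(epochs).max(axis=(1, 2))

# Keep only trials whose peak stays below a user-defined threshold
# (value purely illustrative for this synthetic data).
threshold = 26.0
keep = peak_per_trial < threshold
clean_epochs = epochs[keep]
print(f"kept {keep.sum()} of {n_trials} trials")
```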
Frequency Filter

  Filter      Removes…
  High-pass   Lower frequencies
  Low-pass    Higher frequencies
  Band-pass   Everything outside the specified band
  Notch       A specified narrow band (e.g., line noise)

• Very good first step: remove data you won’t analyze
  (don’t waste time cleaning what you won’t examine)

• Use more advanced techniques for specific noise signals

17
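A sketch of both filter types with SciPy, assuming a 1 kHz sampling rate (the rate, filter orders, and cutoffs are all illustrative):

```python
import numpy as np
from scipy import signal

fs = 1000.0                                                  # Hz, assumed
data = np.random.default_rng(1).normal(size=(306, 10_000))   # channels × samples

# Band-pass 1–40 Hz with a zero-phase Butterworth filter.
sos = signal.butter(4, [1.0, 40.0], btype="bandpass", fs=fs, output="sos")
band_passed = signal.sosfiltfilt(sos, data, axis=-1)

# Notch filter at 60 Hz (50 Hz outside the USA) for line noise.
b, a = signal.iirnotch(w0=60.0, Q=30.0, fs=fs)
notched = signal.filtfilt(b, a, band_passed, axis=-1)
```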
Signal Space Projection




                          20
Signal Space Projection
• Overview: SSP uses the difference between source
  orientations and locations to differentiate distinct
  sources.

• Theory: Since the field pattern from a single source is
 1) unique
 2) time-invariant,
  we can differentiate sources by examining the angle
  between their “signal space representations”, and
  project noise signals out of the dataset.




                                                           21
Signal Space Projection
• In general,

      m(t) = Σ_{i=1}^{M} a_i(t) s_i + n(t)

  where m(t) is the measured signal, s_i is source i, a_i(t) is the source
  amplitude, n(t) is noise, and M is the total number of channels.

• SSP states that s can be split in two:
  - s‖ = signals from known sources
  - s⟂ = signals from unknown sources

      s‖ = P‖ m
      s⟂ = P⟂ m

  where m is the MEG signal and P‖, P⟂ are projection operators.
  Worth mentioning that s‖ + s⟂ = s.

24
Signal Space Projection

How do we find P‖ and P⟂?

• Ingenious application of the magic¹ technique of
  Singular Value Decomposition (SVD)

• Let K = {s₁, s₂, …, sₖ} ∈ s‖ (a matrix of all known sources). Using SVD,
  we find a basis for s‖, and therefore P‖.²

¹ Not really magic
² Let K = UΣVᵀ. By the properties of the SVD, the first k columns of U form an
  orthonormal basis for the column space of K, so we can define
      P‖ = U_k U_kᵀ
      P⟂ = I − P‖
  since s‖ + s⟂ = P‖ m + P⟂ m = s.

25
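A minimal NumPy sketch of this construction, assuming the known noise field patterns are stacked as the columns of K (shapes and names are illustrative):

```python
import numpy as np

# K: (n_channels, k) matrix whose columns are known noise patterns,
# e.g., topographies measured from empty-room or artifact recordings.
rng = np.random.default_rng(2)
n_channels, k = 306, 3
K = rng.normal(size=(n_channels, k))

# SVD of K: the first k left singular vectors span the known subspace.
U, S, Vt = np.linalg.svd(K, full_matrices=False)
Uk = U[:, :k]

P_par = Uk @ Uk.T                      # P‖: projects onto known sources
P_perp = np.eye(n_channels) - P_par    # P⟂ = I − P‖

# Apply to a measured signal m: (n_channels, n_samples).
m = rng.normal(size=(n_channels, 1000))
m_clean = P_perp @ m                   # known (noise) subspace projected out
```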
Signal Space Projection
• Recall m(t) = Σ_{i=1}^{M} a_i(t) s_i + n(t). To find a(t), invert s‖:

      m(t) = a(t) s‖
      â(t) = s‖⁻¹ m(t)
         â = VΣ⁻¹Uᵀ m(t)      (recall that K = {s₁, s₂, …, sₖ} ∈ s‖ = UΣVᵀ)

• In practice, s‖ often consists of known noise signals
  specific to a particular MEG scanner. The final step is
  simply to project those out of m(t), leaving only
  unknown (and presumably neural) sources in s.

26
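Continuing the sketch above, the amplitude estimate â = VΣ⁻¹Uᵀm is simply the pseudo-inverse of K applied to the data:

```python
# Continuing the illustrative SSP sketch: estimate the amplitudes of
# the known sources, then remove their contribution from the data.
a_hat = np.linalg.pinv(K) @ m     # (k, n_samples); pinv(K) = VΣ⁻¹Uᵀ
m_denoised = m - K @ a_hat        # equivalent to P⟂ m above
```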
Signal Space Separation
         (SSS)




                          27
Signal Space Separation
• Overview: Separate MEG signal into sources (1)
  outside and (2) inside the MEG helmet

• Theory: Analyzing the MEG data using a basis which
  expresses the magnetic field as a “gradient of the
  harmonic scalar potential” (defined below) allows the
  field to be separated into internal and external
  components.

  By simply dropping the external component, we can
  significantly reduce the MEG signal noise.




                                                         28
MEG data – raw




                 29
MEG data – SSP




                 30
MEG data – SSS




                 31
Signal Space Separation
• Begin with Maxwell’s laws (H, B: magnetic field; J: sources):

      ∇ × H = J          (1)
      ∇ × B = μ₀J        (2)
      ∇ · B = 0          (3)

• Note that on the surface of the sensor array, J = 0 (i.e., no sources!).
  As such,
      ∇ × H = 0 on the array surface

• Defining H = ∇Ψ, we obtain the identity ∇ × ∇Ψ = 0 in (1).
  This term (∇Ψ) is called the “scalar potential.”
  - “Scalar potential” has no physical correlate.
  - Often written with a negative sign (–∇Ψ) for convenience.
  - H = –∇Ψ → B = –μ₀∇Ψ… used interchangeably

• Substituting the scalar potential into (3), we obtain the Laplacian:
      ∇ · ∇Ψ = ∇²Ψ = 0

Taulu et al, 2005

32
Signal Space Separation
• Substituting the scalar potential into (3), ∇ · B = 0, we obtain the
  Laplacian:

      ∇ · ∇Ψ = ∇²Ψ = 0

  which, written out in spherical coordinates, is

      (1/r²) ∂/∂r ( r² ∂Ψ/∂r ) + (1/(r² sin θ)) ∂/∂θ ( sin θ ∂Ψ/∂θ )
        + (1/(r² sin² θ)) ∂²Ψ/∂φ² = 0

• We can express the scalar potential using spherical
  coordinates ( Ψ(φ, θ, r) ), separate the variables
  ( Ψ(φ,θ,r) = Φ(φ)Θ(θ)R(r) ), and solve the harmonic to obtain

      B(r) = −μ₀ Σ_{l=0}^{∞} Σ_{m=−l}^{l} α_lm ∇( Y_lm(θ, φ) / r^{l+1} )
             − μ₀ Σ_{l=0}^{∞} Σ_{m=−l}^{l} β_lm ∇( r^l Y_lm(θ, φ) )
           ≡ B_α(r) + B_β(r),

  the internal signal plus the external signal.

33
Signal Space Separation




                          34
Temporally-extended
Signal Space Separation
         (tSSS)



                          35
Temporally-extended Signal Space Separation

Conceptually very simple:

• Recall that the SSS algorithm ends with two signal
  components – Bα(r) and Bβ(r), or Bin(r) and Bout(r) –
  and we discard the Bout(r) component
  - Rationale: signals originating outside the MEG sensor helmet
    cannot be brain signal

• tSSS looks for correlations between Bout(r) and Bin(r)
  and projects those correlations out of Bin(r)
  - Rationale: any internal signal correlated with the external
    noise component must represent noise that leaked into the
    Bin(r) component

36
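In practice SSS and tSSS are usually applied with the vendor’s MaxFilter software or with MNE-Python. A hedged sketch of the MNE route (the file name is illustrative, and the site-specific fine-calibration and crosstalk files are omitted):

```python
import mne

# Illustrative file name; real use needs your own recording plus the
# site-specific fine-calibration and crosstalk files.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)

# SSS: keep the internal (B_in) multipole components.
raw_sss = mne.preprocessing.maxwell_filter(raw)

# tSSS: additionally project out internal/external temporal
# correlations estimated over 10-second windows.
raw_tsss = mne.preprocessing.maxwell_filter(raw, st_duration=10.0)
```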
Temporally-extended Signal Space Separation

• From the
  original article:




                                              37
Temporally-extended Signal Space Separation

• From the original article:




                                              38
Temporally-extended Signal Space Separation

• Without tSSS:




                                              39
Temporally-extended Signal Space Separation

• With tSSS:




                                              40
Source Separation Algorithms

41
Principal Component Analysis (PCA)

42
• Ordinary Least Squares (OLS) regression of X to Y

Following five plots from http://stats.stackexchange.com/a/2700/2019

43
• Ordinary Least Squares (OLS) regression of Y to X

44
• Regression lines are different!

45
• PCA minimizes error orthogonal to the model line

(Yes, this is a different dataset)

46
Principal Component Analysis

• “Most accurate” regression line for the data

(Yes, this is another different dataset)

47
PCA – Formal Definition




                         48
PCA – Formal Definition




     http://stat.ethz.ch/~maathuis/teaching/fall08/Notes3.pdf



                                                                49
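A minimal NumPy sketch of PCA via the SVD of mean-centered data, matching the formal definition above (data and shapes illustrative):

```python
import numpy as np

# Illustrative data matrix: (n_samples, n_features).
rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 5)) @ rng.normal(size=(5, 5))

# Center, then SVD: rows of Vt are the principal directions,
# ordered by the variance they explain.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

components = Vt                      # orthonormal principal axes
scores = Xc @ Vt.T                   # data expressed in the PC basis
explained_var = S**2 / (len(X) - 1)  # variance along each axis
```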
PCA shortcomings

• Will only detect orthogonal signals
  (“A Tutorial on Principal Component Analysis”, Jonathon Shlens, April 2009)

• Cannot detect polymodal distributions
  (Appl. Environ. Microbiol. May 2007 vol. 73 no. 9 2878-2890)

50
Independent Component Analysis (ICA)

51
Independent Component Analysis
• Assumptions: Each signal is…
  1. Statistically independent
  2. Non-gaussian

• Recall the Central Limit Theorem:
  “Given independent random variables x + y = z, z is
  more gaussian than x or y.”

• Theory: We can find S by iteratively identifying and
  extracting the most independent and non-gaussian
  components of X

52
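A quick numerical illustration of that Central Limit Theorem claim, using excess kurtosis as a crude nongaussianity measure (values illustrative; a gaussian has excess kurtosis 0):

```python
import numpy as np
from scipy.stats import kurtosis

# Two independent, identically distributed non-gaussian variables;
# Laplace samples have excess kurtosis ≈ 3.
rng = np.random.default_rng(4)
x = rng.laplace(size=100_000)
y = rng.laplace(size=100_000)
z = x + y   # expected excess kurtosis ≈ 1.5: closer to gaussian

print(kurtosis(x), kurtosis(y), kurtosis(z))
```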
ICA in FieldTrip package




                           53
ICA – Mixing matrix

      x₁ = a₁₁s₁ + a₁₂s₂
                            ≡ x = As
      x₂ = a₂₁s₁ + a₂₂s₂

[scatter plots: sources s₁, s₂ and mixtures x₁, x₂]

Goal: Separate s₁ and s₂ using information from x₁ and x₂

54
Independent Component Analysis
• Consider the general mixing equation:

      x₁ = a₁₁s₁ + … + a₁ₙsₙ
       ⋮                          ≡ x = As
      xₙ = aₙ₁s₁ + … + aₙₙsₙ

  where the xᵢ are the sensors, A is the mixing matrix, and the sᵢ are
  the sources.

• If we could find one of the rows of A⁻¹ (let’s call that
  vector w), we could reconstruct a row of s. Mathematically:

      wᵀx = Σᵢ wᵢxᵢ = y

  where w is some row from A⁻¹, and y is one of the ICs (independent
  components) that make up S.

55
Independent Component Analysis
• Working through the math… let z = Aᵀw, where A is the mixing matrix
  and w is some row from A⁻¹ (recall wᵀx = Σᵢ wᵢxᵢ = y and x = As).

• So,
      y = wᵀx
        = wᵀAs
        = zᵀs
  where y is one of the ICs.

• y (an IC) is a linear combination of s, with weights zᵀ.

• Recall the Central Limit Theorem:
  “Given independent random variables x + y = z, z is
  more gaussian than x or y.”
  zᵀs is more gaussian than any single sᵢ, and is least gaussian
  when equal to one of the sᵢ.

• We therefore want to take w as the vector that maximizes the
  nongaussianity of wᵀx, ensuring that wᵀx = zᵀs picks out a
  single source.

56
Independent Component Analysis
• How can we find w so as to maximize the
  nongaussianity of wᵀx?

• Numerous methods:
  - Kurtosis
  - Negentropy
  - Approximations of negentropy

• Once found, proceed much as in PCA: find w, remove that
  component, find the next best w, and repeat until as many
  components as sensors have been extracted.

57
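A minimal sketch of this whole pipeline using scikit-learn’s FastICA, which maximizes nongaussianity through a negentropy approximation (the sources and mixing matrix here are synthetic illustrations):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic non-gaussian sources, linearly mixed.
rng = np.random.default_rng(5)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sign(np.sin(3 * t)),        # square wave
          rng.laplace(size=t.size)]      # heavy-tailed noise
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])               # mixing matrix
X = S @ A.T                              # observed mixtures, x = As

# FastICA recovers the sources up to permutation and scaling.
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
S_est = ica.fit_transform(X)             # estimated independent components
A_est = ica.mixing_                      # estimated mixing matrix
```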
ICA in FieldTrip (2)

58
Mantini, Franciotti, Romani, & Pizzella (2007)

                                             59
Mantini, Franciotti, Romani, & Pizzella (2007)

60
Mantini, Franciotti, Romani, & Pizzella (2007)

                                             61
ICA – Method Comparison




              Zavala-Fernández, Sander, Burghoff, Orglmeister, & Trahms (2006)

                                                                                 62
Summary
• Examine your data in as many ways as possible
• Use SSS & tSSS to best clean data
• Use ICA to find specific artifacts
• Always check your data!




                                                  63
Questions?
             64
