On the Dual Geometry of Laplacian Eigenfunctions


Authors: Alexander Cloninger, Stefan Steinerberger

Abstract. We discuss the geometry of Laplacian eigenfunctions $-\Delta \phi = \lambda \phi$ on compact manifolds $(M,g)$ and combinatorial graphs $G = (V,E)$. The 'dual' geometry of Laplacian eigenfunctions is well understood on $\mathbb{T}^d$ (identified with $\mathbb{Z}^d$) and $\mathbb{R}^n$ (which is self-dual). This dual geometry plays a tremendous role in various fields of pure and applied mathematics. The purpose of our paper is to point out a notion of similarity between eigenfunctions that allows one to reconstruct that geometry. Our measure of 'similarity' $\alpha(\phi_\lambda, \phi_\mu)$ between eigenfunctions $\phi_\lambda$ and $\phi_\mu$ is given by a global average of local correlations,
$$\alpha(\phi_\lambda, \phi_\mu)^2 = \|\phi_\lambda \phi_\mu\|_{L^2}^{-2} \int_M \left( \int_M p(t,x,y)\,(\phi_\lambda(y) - \phi_\lambda(x))(\phi_\mu(y) - \phi_\mu(x))\, dy \right)^2 dx,$$
where $p(t,x,y)$ is the classical heat kernel and $e^{-t\lambda} + e^{-t\mu} = 1$. This notion recovers all classical notions of duality but is equally applicable to other (rough) geometries and graphs; many numerical examples in different continuous and discrete settings illustrate the result.

1. Introduction

1.1. Introduction. The Laplacian eigenfunctions on $\mathbb{T}^2$ are easily determined and given by $\phi_k = e^{i\langle k, x\rangle}$, where $k \in \mathbb{Z}^2$ is a lattice point. They are orthogonal in $L^2$ and allow for a representation of a function $f \in L^2(\mathbb{T}^2)$. However, it very quickly becomes apparent that the geometry of $\mathbb{Z}^2$ is not only a convenient enumeration but plays a fairly fundamental role itself. Examples include

(1) a beautiful inequality of Zygmund [34] stating that for any $r > 0$
$$\sum_{\|k\| = r} |\hat{f}(k)|^2 \le 5^{1/2}\, \|f\|^2_{L^{4/3}(\mathbb{T}^2)}$$
and, more generally, restriction phenomena in harmonic analysis,

(2) the analysis of the nonlinear Schrödinger equation (see e.g.
[33]) $iu_t + \Delta u = |u|^{p-1} u$, as well as other general nonlinear dispersive equations,

(3) the structure of pseudo-differential operators [13],

(4) the operation of wavelets and spectral filters on images [14],

(5) or, a personal example [31], an inequality for $f \in C^1(\mathbb{T}^2)$ with mean value 0,
$$\|\nabla f\|_{L^2(\mathbb{T}^2)} \left\| \partial_x f + \sqrt{2}\, \partial_y f \right\|_{L^2(\mathbb{T}^2)} \ge c\, \|\nabla f\|^2_{L^2(\mathbb{T}^2)}.$$

It is clear that the torus $\mathbb{T}^d$ is a special case. The same is true for $\mathbb{R}^d$, whose eigenfunctions (now understood in a distributional sense) satisfy the same additive relationship $e^{i\langle m,x\rangle} e^{i\langle n,x\rangle} = e^{i\langle m+n,x\rangle}$. This is one of the reasons why so much mathematical analysis is possible on $\mathbb{T}^d$ and $\mathbb{R}^d$ that cannot be easily generalized. A somewhat philosophical question, intentionally put vaguely, is whether or not, for a given manifold $(M,g)$ or graph $G = (V,E)$, there is additional structure in the eigenfunctions beyond orthogonality and the eigenvalue.

2010 Mathematics Subject Classification. 35J05, 35P05, 42C10, 65T60, 81Q50, 94A11.
Key words and phrases. Laplacian eigenfunctions, Dual Geometry, Product formulas, Triple product, Quantum Chaos, Harmonic Analysis on Graphs.

1.2. Graph Signal Processing. This question has turned out to be of increasing importance in modern problems of data science for rather obvious reasons: if we are interested in either selecting or filtering out certain types of substructures, and if those substructures tend to be connected to certain types of eigenvectors or eigenfunctions, then one needs to understand that interplay. Conversely, 'proximity' of eigenvectors should be indicative of capturing the same kind of phenomenon. This is easily observed on $\mathbb{T}^d$ and $\mathbb{R}^d$, where eigenvectors $e^{i\langle m,x\rangle}$ and $e^{i\langle n,x\rangle}$ are 'close' if $\|m - n\|$ is small, and this corresponds to having oscillations point in roughly the same direction.
The importance of this elementary observation is difficult to overstate; take, for example, $f \in L^2(\mathbb{R}^2)$ and define $g \in L^2(\mathbb{R}^2)$ by restricting the Fourier transform,
$$\hat{g}(\xi_1, \xi_2) = \chi_{\xi_1 + \xi_2 \ge 0}(\xi_1, \xi_2)\, \hat{f}(\xi_1, \xi_2);$$
this corresponds to a filter with an obvious geometric interpretation. More precisely, the very notion of a Fourier multiplier (and, correspondingly, entire branches of mathematics) is intimately entangled with this underlying geometry of the eigenfunctions of the Laplacian.

Challenge (Graph Signal Processing). Given a graph $G = (V,E)$, define, if possible, an analogous geometry on its eigenvectors.

This is an absolutely fundamental problem; we refer to [1, 2, 5, 7, 8, 9, 10, 12, 15, 17, 18, 23, 25, 28, 29, 30] for recent examples. Moreover, it is not expected that this is always (or even generically) possible: even in Euclidean space, one would expect that eigenfunctions on generic domains do not have any distinguishing features except for their eigenvalue; this vague statement is made precise in different ways in the study of quantum chaos [16, 21].

1.3. Our contribution. The purpose of this paper is to present a somewhat curious definition of a notion of affinity or similarity between two eigenfunctions (or eigenvectors). This notion is identical on manifolds and graphs and results in a number in $[0,1]$, with 1 indicating strong similarity and 0 denoting weak similarity. This then allows us to take a finite set of eigenfunctions $\{\phi_1, \dots, \phi_n\}$ and compute an $n \times n$ matrix $A \in [0,1]^{n \times n}$, where $A_{ij}$ denotes the similarity of $\phi_i$ and $\phi_j$. This is then interpreted as the weighted affinity matrix of a complete weighted graph on $\{\phi_1, \dots, \phi_n\}$ that, we hope, encodes the underlying geometry of the eigenfunctions.
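The half-plane multiplier $\hat{g} = \chi_{\xi_1 + \xi_2 \ge 0}\, \hat{f}$ mentioned above is straightforward to realize on a discrete periodic grid. The following short sketch is entirely our own illustration (signal, grid size, and variable names are ours), using NumPy's FFT:

```python
import numpy as np

# The half-plane Fourier multiplier chi_{xi_1 + xi_2 >= 0} from the text,
# realized on a discrete periodic N x N grid (our own illustration).

N = 64
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
f = np.cos(3 * x)[:, None] + np.sin(5 * x)[None, :]   # a simple test signal

F = np.fft.fft2(f)
xi = np.fft.fftfreq(N, d=1.0 / N)                     # integer frequencies
mask = (xi[:, None] + xi[None, :]) >= 0               # chi_{xi1 + xi2 >= 0}
g = np.fft.ifft2(F * mask)                            # filtered signal (complex in general)

# Frequencies outside the half-plane have been removed:
print(bool(np.allclose(np.fft.fft2(g)[~mask], 0)))    # True
```

Since the indicator of a half-plane is not conjugate-symmetric, the filtered signal $g$ is complex-valued in general, exactly as in the continuous setting.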
We then use a fairly standard visualization technique that embeds complete weighted graphs into Euclidean space in a geometry-preserving manner; there is some flexibility in this step, and one could use any number of visualization techniques. The main purpose of this paper is to discuss this algorithm in detail and to show that it recovers the geometry of the eigenfunctions in all classical cases where such a geometry exists. This has substantial theoretical and practical applications.

(1) Practically, this allows us to group eigenvectors of a Laplacian into various natural geometric substructures beyond just ordering by their eigenvalue. This is of obvious significance in graph signal processing, as well as in the choice of eigenvectors used to visualize a number of non-linear dimensionality-reduction embeddings.

(2) Theoretically, it raises a curious connection between the geometry of eigenfunctions, how the pointwise product $\phi_i \phi_j$ spreads over the spectrum (i.e., what $\langle \phi_i \phi_j, \phi_k \rangle$ looks like as a sequence in $k$; see also [32]) and, through its definition as a local correlation, the local structure. This gives rise to a number of fascinating theoretical problems.

2. A Notion of Similarity

2.1. Similarity. We define a quantitative notion of similarity $0 \le \alpha(\phi_\lambda, \phi_\mu) \le 1$ between two eigenfunctions. We first define it on compact manifolds $(M,g)$ without boundary (this assumption can be dropped but simplifies the exposition). We denote the solution of the heat equation
$$(\partial_t - \Delta)\, u(t,x) = 0, \qquad u(0,x) = f(x),$$
by $e^{t\Delta} f$. This induces a heat kernel $p(t,x,y)$ satisfying
$$\left(e^{t\Delta} f\right)(x) = \int_M p(t,x,y)\, f(y)\, dy.$$
We have $p(t,x,y) = p(t,y,x)$, and conservation of the $L^1$-mass implies that $p(t,x,\cdot)$ is a probability distribution. Moreover, Varadhan's short-time asymptotic implies that for $t$ small, $p(t,x,\cdot)$ is essentially a Gaussian centered at $x$ with scale $\sim \sqrt{t}$.
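On a graph these properties have an exact discrete analogue: with $L = D - A$ the combinatorial Laplacian, the propagator $e^{t\Delta} = e^{-tL}$ is a symmetric matrix whose rows are probability distributions. A minimal sketch (the path graph and all function names are our own illustrative choices):

```python
import numpy as np
from scipy.linalg import expm

# Discrete analogue of the heat kernel p(t, x, y): on a graph with combinatorial
# Laplacian L = D - A, the propagator e^{t Delta} is the matrix exponential e^{-tL}.
# It is symmetric and its rows sum to one, mirroring p(t,x,y) = p(t,y,x) and the
# conservation of L^1 mass from the text.

def path_laplacian(n):
    """Combinatorial Laplacian of the path graph on n vertices (an illustrative choice)."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A

L = path_laplacian(20)
t = 0.5
P = expm(-t * L)                                   # discrete heat kernel p(t, x, y)

print(bool(np.allclose(P, P.T)))                   # symmetry -> True
print(bool(np.allclose(P.sum(axis=1), 1)))         # rows are probability distributions -> True
print(bool((P >= -1e-12).all()))                   # nonnegative up to roundoff -> True
```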
We introduce the following measure of similarity between two eigenfunctions:
$$\alpha(\phi_\lambda, \phi_\mu)^2 = \|\phi_\lambda \phi_\mu\|_{L^2}^{-2} \int_M \left( \int_M p(t,x,y)\,(\phi_\lambda(y) - \phi_\lambda(x))(\phi_\mu(y) - \phi_\mu(x))\, dy \right)^2 dx,$$
where $t$ is the unique solution of $e^{-t\lambda} + e^{-t\mu} = 1$. This is not a metric. The main motivation for this quantity is (1) that it can be interpreted as an average over local correlations (eigenfunctions that behave similarly should look locally similar in lots of places) and (2) that it appeared somewhat naturally in studies on products of eigenfunctions [32], where it was shown to satisfy the identity
$$\alpha(\phi_\lambda, \phi_\mu) = \frac{\|e^{t\Delta}(\phi_\lambda \phi_\mu)\|_{L^2}}{\|\phi_\lambda \phi_\mu\|_{L^2}}$$
for exactly $e^{-t\lambda} + e^{-t\mu} = 1$. The second property immediately shows $0 \le \alpha(\phi_\mu, \phi_\lambda) \le 1$. Both local correlations and the diffusion under the heat equation (and thus, implicitly, the distribution in the spectrum) are measured by the same quantity $\alpha$. Its being large implies strong local correlations and frequency transport to lower frequencies (meaning that the expansion of the product $\phi_\lambda \phi_\mu$ into eigenfunctions contains some low-frequency contributions); conversely, its being small implies frequency transport to higher frequencies (see [32] for details). This motivated it as an interesting object worthy of further study and of the investigations reported on in this paper.

2.2. Creating a Landscape. We observe that the definition of $\alpha(\phi_\mu, \phi_\lambda)$ only gives rise to a weighted graph; however, we would very much like to understand its intrinsic structure. This leads to a very substantial problem that is currently receiving a great deal of interest: how to 'accurately' map the vertices of a weighted graph to $\mathbb{R}^d$ in such a way that vertices that are 'close to each other' are also close in the embedding and, conversely, vertices that are far away are mapped to different regions in space (see Figure 1).
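One computational note on the definition above: for $\lambda, \mu > 0$ the function $t \mapsto e^{-t\lambda} + e^{-t\mu}$ decreases strictly from 2 towards 0, so the equation $e^{-t\lambda} + e^{-t\mu} = 1$ always has a unique positive root, which any standard root finder locates. A hedged sketch (our own, not the authors' code):

```python
import numpy as np
from scipy.optimize import brentq

def heat_time(lam, mu):
    """Unique t > 0 with exp(-t*lam) + exp(-t*mu) = 1 (assumes lam, mu > 0).

    The left-hand side decreases strictly from 2 at t = 0 towards 0, so a
    sign-change bracket always exists and the root is unique.
    """
    f = lambda t: np.exp(-t * lam) + np.exp(-t * mu) - 1.0
    hi = 1.0
    while f(hi) > 0:          # grow the bracket until the root is enclosed
        hi *= 2.0
    return brentq(f, 0.0, hi)

# Sanity check: for lam == mu the equation reads 2 exp(-t*lam) = 1, so t = ln(2)/lam.
print(round(heat_time(3.0, 3.0), 6))   # 0.231049  (= ln 2 / 3)
```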
We often want $d \in \{2, 3\}$ for visualization purposes.

Figure 1. The problem of finding a mapping from the vertices of a weighted graph to $\mathbb{R}^d$ in such a way that 'nearby' vertices get mapped to 'nearby' points is of substantial interest in data science and also occurs in our context.

There are a number of methods for creating a low-dimensional embedding of points given some notion of distance or similarity between points [4, 11, 19, 20, 22, 26]. Ultimately, if the structure is well encoded in the mutual distances, then it does not matter very much which method is used. Throughout this paper, we only use one of the very simplest methods: let us enumerate the eigenfunctions under consideration by $\{1, \dots, n\}$ and let $A_{ij} = \alpha(\phi_i, \phi_j)$ be the $n \times n$ matrix containing all mutual affinities. We then map
$$\phi_i \to (v_{1,i}, v_{2,i}, v_{3,i}),$$
where $v_1, v_2, v_3$ are the three eigenvectors corresponding to the three largest (in absolute value) eigenvalues of the matrix $A$. The subscript denotes the $i$-th entry of the vector; in particular, for every eigenfunction $\phi_i$ we use the $i$-th entry in the first, second and third largest eigenvectors as $(x,y,z)$-coordinates for a point in $\mathbb{R}^3$. Sometimes it may be advantageous not to use the first three eigenvectors, though the best result is usually obtained by picking the three eigenvectors associated to the largest eigenvalues. We will also sometimes use
$$A_{ij} = \alpha(\phi_i, \phi_j)^p$$
for some $p \ge 1$, which has the effect of making strong existing affinities even more pronounced by disproportionately weakening smaller affinities. We recall that
$$\alpha(\phi_i, \phi_j) = \frac{\|e^{t\Delta}(\phi_i \phi_j)\|_{L^2}}{\|\phi_i \phi_j\|_{L^2}}$$
requires the computation of a suitable $t$ depending on the eigenvalues of $\phi_i$ and $\phi_j$. We emphasize that the value depends smoothly on $t$ and is not very sensitive.
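For concreteness, the whole pipeline (affinities from the identity $\alpha(\phi_i, \phi_j) = \|e^{t\Delta}(\phi_i \phi_j)\| / \|\phi_i \phi_j\|$, followed by the top-eigenvector embedding) can be sketched on the eigenvectors of a cycle graph. This is our own minimal reading of the procedure, not the authors' implementation; it uses the graph analogue $e^{t\Delta} = e^{-tL}$ and, for simplicity, sets the diagonal of $A$ to 1:

```python
import numpy as np
from scipy.optimize import brentq

# Sketch of the affinity/embedding pipeline on the cycle graph C_n (names are ours):
#   A[i, j] = ||e^{-tL}(phi_i * phi_j)|| / ||phi_i * phi_j||,
#   with t the unique root of exp(-t*lam_i) + exp(-t*lam_j) = 1,
# followed by an embedding through the top eigenvectors of A (here of A**4).

def cycle_laplacian(n):
    """Combinatorial Laplacian of the cycle graph C_n."""
    L = 2.0 * np.eye(n)
    idx = np.arange(n)
    L[idx, (idx + 1) % n] = -1.0
    L[idx, (idx - 1) % n] = -1.0
    return L

n = 40
L = cycle_laplacian(n)
lam, U = np.linalg.eigh(L)            # eigenpairs of the graph Laplacian

m = 15                                # compare the first m non-constant eigenvectors
A = np.eye(m)                         # diagonal set to 1 (self-similarity) for simplicity
for i in range(m):
    for j in range(i + 1, m):
        li, lj = lam[i + 1], lam[j + 1]                # skip the constant eigenvector
        f = lambda t, a=li, b=lj: np.exp(-t * a) + np.exp(-t * b) - 1.0
        t = brentq(f, 1e-12, 1e8)                      # unique positive root
        prod = U[:, i + 1] * U[:, j + 1]               # pointwise product phi_i * phi_j
        heat = U @ (np.exp(-t * lam) * (U.T @ prod))   # apply e^{-tL}
        A[i, j] = A[j, i] = np.linalg.norm(heat) / np.linalg.norm(prod)

w, V = np.linalg.eigh(A ** 4)         # pointwise power p = 4, as in the text
coords = V[:, np.argsort(-np.abs(w))[:3]]   # rows = embedded eigenvectors in R^3
print(coords.shape)                   # (15, 3)
```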
Sometimes we will even use a fixed value $t_0$, independently of the eigenvalues, for all computations, to demonstrate robustness. Finally, at points in this section, we also deal with eigenvectors of the normalized graph Laplacian $L$ defined on a finite collection of points $\{x_1, \dots, x_n\} \subset \mathbb{R}^d$. This corresponds to a numerical computation of eigenvectors of a discretization of a continuous manifold. We follow standard procedure and define neighbors of points via the matrix
$$K_{i,j} = \exp\left(-\|x_i - x_j\|_2^2 / \sigma^2\right),$$
where $\sigma > 0$ is a parameter that fixes a scale. The normalized graph Laplacian is then
$$L = \mathrm{Id}_{n \times n} - D^{-1/2} K D^{-1/2}, \qquad \text{where } D_{i,j} = \begin{cases} \sum_{k=1}^n K_{i,k}, & \text{for } i = j, \\ 0, & \text{for } i \ne j. \end{cases}$$
This normalized Laplacian has an eigendecomposition which satisfies properties similar to those of the manifold Laplacian eigenfunctions; if the points are actually sampled from an underlying manifold, then this construction is known to converge to the continuous Laplacian [3]. We will use this notion of graph Laplacian throughout the paper when constructing affinities of eigenvectors.

3. Numerical Examples

The remainder of the paper is devoted to the study of dual landscapes of Laplacian eigenfunctions and eigenvectors obtained by the method outlined above. More precisely, we will consider:

(1) The one-dimensional torus $\mathbb{T}$ (with endpoints identified) discretized to a cycle graph $C_n$; the eigenfunctions are given by discrete approximations of $\sin(kx)$ and $\cos(kx)$.

(2) Spherical harmonics $\phi_\ell^m$ as eigenfunctions of $\mathbb{S}^2$; we recover both indices $\ell$ and $m$.

(3) A standard rectangle $[0,4] \times [0,1] \subset \mathbb{R}^2$. Eigenfunctions of the Laplacian are grouped by specifying the number of oscillations in each direction; the method recovers this perfectly.

(4) We then study more general Cartesian products; the Laplacian eigenfunction of $A \times B$ is merely the product of eigenfunctions on $A$ and eigenfunctions on $B$.
The underlying Cartesian structure is perfectly recovered even if $A, B$ are rather structureless objects.

(5) On the other end of the spectrum, we consider Erdős–Rényi random graphs. Laplacian eigenfunctions on these objects should not exhibit any particular structure nor any distinguishing features. We recover an ordering with respect to the eigenvalue.

(6) We conclude by demonstrating several surprising eigenfunction landscapes and discuss this method's uses in exploratory spectral graph theory.

We emphasize that, while most examples were computed on the known eigenfunctions, equivalent landscapes are discovered for the empirical eigenvectors of the graph Laplacian on the domain.

3.1. The one-dimensional Torus. We begin with the Laplacian eigenfunctions of the torus restricted to a uniform grid consisting of $n = 100$ equispaced points (resulting in $n$ eigenvectors, with at least the low-frequency eigenvectors approximating the classical trigonometric functions). We emphasize that, especially at high frequencies, we only have one sampling point per wavelength (which is very little). The method nonetheless remains functional (but the effect can be seen in the affinity matrix). Figure 2 shows the distance matrix $A$, and Figure 3 shows the spectral embedding.

Figure 2. Affinity matrix of the eigenfunctions on a discretized torus.

The method clearly recovers the linear dual structure of the eigenfunctions. It also clearly states that eigenvectors approximating $\sin kx$ and $\cos kx$ are quite different functions, but orders them into nearby points in the landscape because they behave similarly with respect to the other eigenfunctions. Finally, the identity
$$\exp\left(2\pi i \left[\frac{n}{2} - k\right]\frac{j}{n}\right) \exp\left(2\pi i \left[\frac{n}{2} + k\right]\frac{j}{n}\right) = e^{2\pi i j} = 1$$
is automatically discovered. The embedding of the fourth (pointwise) power of the affinity matrix (i.e., $p = 4$ as described above) shows a linear structure underlying the eigenvectors.

Figure 3.
Embedding of the distance matrix $A^4$.

3.2. Spherical Harmonics. We now describe the process on spherical harmonics, i.e., the eigenfunctions of the Laplacian on $\mathbb{S}^2$. They can be separated into levels $\phi_\ell^m$, where $-\ell \le m \le \ell$, with corresponding eigenvalue $\ell(\ell+1)$. We generate these eigenfunctions by taking a linear spacing of 181 points in both spherical coordinate angles $(\alpha, \beta)$, which makes the eigenfunctions satisfy the orthogonality relation
$$\int_0^\pi \int_0^{2\pi} \phi_\ell^m(\alpha, \beta)\, \phi_{\ell'}^{m'}(\alpha, \beta)\, \sin(\alpha)\, d\beta\, d\alpha = \delta_{\ell,\ell'}\, \delta_{m,m'}.$$
Figure 4 describes an embedding of the 256 lowest-frequency spherical harmonic functions ($\ell \le 14$).

Figure 4. Embedding of $A^{(2)}$ for the low-frequency spherical harmonics. For the spectral embedding, each point corresponds to a specific spherical harmonic. Lines are drawn to connect all $\phi_\ell^m$ for a fixed $\ell$, and the color of the node corresponds to $m/\ell$. $\phi_\ell^m$ and $\phi_\ell^{-m}$ are effectively on top of one another in the embedding; $\phi_\ell^m$ for $m < 0$ are represented by dots and $\phi_\ell^m$ for $m \ge 0$ by circles.

The method clearly recovers the level $\ell$ of the spherical harmonics. It also recovers the relative ordering of the degree $m$ of the spherical harmonic, with $\phi_\ell^m$ and $\phi_\ell^{-m}$ being found to be quite different functions. Despite this, $\phi_\ell^m$ and $\phi_\ell^{-m}$ are ordered into nearby points in the landscape, as they behave similarly with respect to the rest of the spherical harmonics.

3.3. Cartesian Product Structure. We consider the rectangle $[0,4] \times [0,1] \subset \mathbb{R}^2$. The eigenfunctions of the Laplacian with Dirichlet boundary conditions are given by
$$\phi_{mn} = \sin\left(\frac{\pi m x}{4}\right) \sin\left(\frac{\pi n y}{1}\right)$$
with corresponding eigenvalue
$$\lambda_{mn} = \pi^2\left(\frac{m^2}{16} + \frac{n^2}{1^2}\right).$$

Figure 5. Low-frequency eigenfunctions of the Laplacian on a rectangular grid, with images of each eigenvector displayed in their respective positions.

The precise geometric structure, how many times an eigenfunction oscillates in the $x$-direction vs.
how many times it oscillates in the $y$-direction, cannot be understood from the eigenvalue alone (especially in the high-frequency limit, where these points are equally spaced on an ellipse). Our method clearly recovers the underlying oscillation structure and orders eigenfunctions accordingly.

Figure 6. Embedding of eigenfunctions on $[0,4] \times [0,1]$. We restricted the exact eigenfunctions to $n = 400$ grid points arranged as a $40 \times 10$ rectangle. The spectral embedding reflects the separable relationship between eigenfunctions oscillating in differing directions, with the 10 lines of 40 eigenfunctions perfectly grouping the $\phi_{mn}$.

Figure 6 displays the full landscape of the 400 eigenfunctions. Figure 5 also displays the embedding for $1 \le m \le 5$ and $1 \le n \le 10$, with each point being the image of the respective $\phi_{mn}$, in order to demonstrate that the spectral embedding does in fact organize the eigenfunctions correctly.

3.4. More General Cartesian Products. The reconstruction of tensor-product geometry and separable eigenfunctions holds at a much greater level of generality. We sample a set $X$ of 100 points in $\mathbb{R}^2$ from a Gaussian distribution $X \sim \mathcal{N}(0, \sigma^2 I_d)$, where $\sigma^2 = 0.01$, then take the set $Y$ of 10 equispaced points from $[0,1]$ and consider the set $X \times Y$. The Cartesian product structure is perfectly recovered; the result is shown in Figure 7.

Figure 7. Embedding of eigenvectors recovers $X \times Y$ with $x_i \sim \mathcal{N}(0, \sigma^2)$.

The method clearly recovers the frequency of oscillation in the $Y$ direction and separates the eigenfunctions into different groups that constructively interfere with one another. In particular, the point at the tail of the $k$-th line corresponds to the eigenfunction that is constant on $X$ and varies $k$ times in the $Y$ direction.

3.5. Objects without Structure. We now consider an Erdős–Rényi graph $G(1000, 0.2)$.
This is a random object on $n = 1000$ vertices, with the likelihood of two vertices being connected being $p = 0.2$. We expect this object to be completely random and the eigenfunctions to all behave in a fairly uniform manner. Results of this flavor have been of great interest recently. We refer, for example, to a rather general result of Rudelson & Vershynin [24], who prove that eigenvectors of graphs with i.i.d. random entries are uniformly flat in the sense of not having any entries substantially larger than $\sim n^{-1/2}$ (up to logarithmic factors).

Figure 8. Embedding of eigenvectors of the normalized graph Laplacian on an Erdős–Rényi graph.

The result of the embedding is shown in Figure 8. We clearly observe that the first eigenvector (which does not change sign) is separated from the rest, but the remaining eigenvectors are fairly structureless and clearly ordered with respect to their eigenvalue.

Figure 9. Embedding of eigenvectors recovers $X \times Y$ for a Cartesian product graph. (Left) First three coordinates of the embedding. (Right) Second, third, and fourth coordinates of the embedding.

3.6. Exploratory Spectral Graph Theory. This technique can be used to discover interesting structures in the eigenspace that are either not obvious or even previously unknown. We return to the example of separable eigenfunctions: take $K_1$ to be the adjacency matrix of an Erdős–Rényi random graph $G(100, 0.2)$, and $K_2(x_i, x_j) = e^{-\|x_i - x_j\|^2/\sigma^2}$ for uniformly sampled points $x_i \in [0,1]$. We take the final kernel $K$ to be the Kronecker product $K = K_2 \otimes K_1$, which corresponds to the Cartesian product of the Erdős–Rényi graph across 10 points on a uniformly spaced grid, and build the normalized Laplacian from $K$. The spectral theory of Cartesian product graphs implies that the eigenfunctions are separable across each dimension [27].
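The separability statement can be checked directly: if $L_1 u = \lambda u$ and $L_2 v = \mu v$, then the Cartesian product Laplacian $L_1 \otimes I + I \otimes L_2$ satisfies $(L_1 \otimes I + I \otimes L_2)(u \otimes v) = (\lambda + \mu)(u \otimes v)$. A small self-contained verification (our own illustration, on path graphs rather than the random kernels of the text):

```python
import numpy as np

# Spectral separability on Cartesian product graphs: the product Laplacian
# L = kron(L1, I) + kron(I, L2) has eigenpairs (lam_i + mu_j, kron(u_i, v_j)).
# The path graphs below are an arbitrary illustrative choice.

def path_laplacian(n):
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A

L1 = path_laplacian(5)
L2 = path_laplacian(4)
L = np.kron(L1, np.eye(4)) + np.kron(np.eye(5), L2)

lam1, U1 = np.linalg.eigh(L1)
lam2, U2 = np.linalg.eigh(L2)

# One separable eigenpair, checked explicitly:
i, j = 2, 1
phi = np.kron(U1[:, i], U2[:, j])
print(bool(np.allclose(L @ phi, (lam1[i] + lam2[j]) * phi)))   # True

# The full spectrum of L is exactly all sums lam_i + mu_j:
sums = np.sort((lam1[:, None] + lam2[None, :]).ravel())
print(bool(np.allclose(sums, np.linalg.eigvalsh(L))))          # True
```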
Figure 9 shows the landscape of the eigenfunctions as displayed by the three largest eigenvectors of the eigenfunction affinity matrix, as well as by the second, third, and fourth largest eigenvectors. The method clearly recovers a very interesting structure in the eigenfunctions, with the lowest-frequency eigenfunctions exhibiting a different structure from the majority, which are organized in a two-dimensional grid. Moreover, this landscape can be used to find interesting connections between eigenfunctions that are otherwise non-obvious. To demonstrate this, we look at the nearest neighbors of $\phi_{10}$ in the landscape and examine their Hadamard product with $\phi_{10}$.

Figure 10. The eigenvectors $\phi_{10}$ and $\phi_{11}$ of the Cartesian product graph. In the landscape, the nearest neighbor of $\phi_{10}$ is $\phi_{11}$; $\phi_9$ is significantly further away in the landscape despite being equally spaced in the spectrum.

$\phi_{11}$ is the nearest neighbor of $\phi_{10}$ in the landscape and is 75 times closer to $\phi_{10}$ than $\phi_9$ is, despite the fact that they are about equidistant in the spectrum. Figure 10 shows these eigenfunctions and their Hadamard product: we discover that, despite $\phi_{10}$ and $\phi_{11}$ being chaotic, their product perfectly cuts $Y$ in half.

We also use this technique to look at the differences between the normalized and unnormalized graph Laplacian for an Erdős–Rényi random graph $G(1000, 0.2)$. Figure 11 shows the landscape of
the eigenfunctions of the unnormalized graph Laplacian.

Figure 11. Landscape of the eigenfunctions of the unnormalized graph Laplacian on a random graph, and the $\ell^1$ norm of these ($\ell^2$-normalized) eigenfunctions.

Notice that, in contrast to the normalized Laplacian, where only $\phi_0$ stands out, in the unnormalized Laplacian there are several low-frequency and high-frequency eigenfunctions that are clearly separated from the vast majority. This was a curiosity to the authors, and upon further investigation it was discovered that this deviation corresponds to the fact that some eigenfunctions of the unnormalized Laplacian have slightly smaller $\ell^1$ norm than the vast majority. Indeed, as can be seen in Figure 11, we observe that the $\ell^1$ norm seems to have a nontrivial limiting distribution over the spectrum. We are not aware of any results in that direction.

References

[1] J. Ankenmann, Geometry and Analysis of Dual Networks on Questionnaires, PhD thesis, Yale, 2014.
[2] J. Ankenmann and W. Leeb, Mixed Hölder matrix discovery via wavelet shrinkage and Calderón–Zygmund decompositions, to appear in Appl. Comput. Harmon. Anal.
[3] M. Belkin and P. Niyogi, Convergence of Laplacian eigenmaps. Advances in Neural Information Processing Systems, 2007.
[4] M. Belkin and P. Niyogi, Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation 15.6 (2003): 1373–1396.
[5] M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst, Geometric deep learning: going beyond Euclidean data, IEEE Signal Processing Magazine 34 (4), 18–42, (2017).
[6] F. R. Chung, Spectral Graph Theory. CBMS Regional Conference Series in Mathematics, 92. American Mathematical Society, Providence, RI, 1997.
[7] R. Coifman and M. Gavish, Harmonic analysis of digital data bases. In: Wavelets and Multiscale Analysis, 161–197, Appl. Numer. Harmon. Anal., Birkhäuser/Springer, New York, 2011.
[8] R. Coifman and M. Gavish, Sampling, denoising and compression of matrices by coherent matrix organization. Appl. Comput. Harmon. Anal. 33 (2012), no. 3, 354–369.
[9] R. Coifman and M. Gavish, Harmonic analysis of databases and matrices. Excursions in Harmonic Analysis, Volume 1, 297–310, Appl. Numer. Harmon. Anal., Birkhäuser/Springer, New York, 2013.
[10] R. Coifman and M.
Gavish, Information integration, organization, and numerical harmonic analysis. Mathematical and Computational Modeling, 254–271, Pure Appl. Math. (Hoboken), Wiley, Hoboken, NJ, 2015.
[11] R. Coifman and S. Lafon, Diffusion maps. Applied and Computational Harmonic Analysis 21.1 (2006): 5–30.
[12] R. Coifman and W. Leeb, Hölder–Lipschitz norms and their duals on spaces with semigroups, with applications to Earth Mover's Distance. Journal of Fourier Analysis and Applications, 22(4) (2016), 910–953.
[13] R. Coifman and Y. Meyer, Au delà des opérateurs pseudo-différentiels. With an English summary. Astérisque, 57. Société Mathématique de France, Paris, 1978.
[14] Y. Meyer and R. Coifman, Wavelets. Calderón–Zygmund and Multilinear Operators. Translated from the 1990 and 1991 French originals by David Salinger. Cambridge Studies in Advanced Mathematics, 48. Cambridge University Press, 1997.
[15] S. Constantin, Diffusion Harmonics and Dual Geometry on Carnot Manifolds, PhD thesis, Yale, 2015.
[16] F. Haake, Quantum Signatures of Chaos. With a foreword by H. Haken. Second edition. Springer Series in Synergetics. Springer-Verlag, Berlin, 2001.
[17] D. Hammond, P. Vandergheynst, R. Gribonval, Wavelets on graphs via spectral graph theory, Applied and Computational Harmonic Analysis 30 (2), 129–150.
[18] J. Irion, N. Saito, Hierarchical graph Laplacian eigen transforms. JSIAM Lett. 6 (2014), 21–24.
[19] J. Kruskal and M. Wish, Multidimensional Scaling. Vol. 11. Sage, 1978.
[20] I. Jolliffe, Principal Component Analysis. New York: Springer-Verlag, 1986.
[21] S. Nonnenmacher, Anatomy of quantum chaotic eigenstates. In: Chaos, 193–238, Prog. Math. Phys., 66, Birkhäuser/Springer, Basel, 2013.
[22] K. Pearson, On lines and planes of closest fit to systems of points in space. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 2.11 (1901): 559–572.
[23] N. Perraudin, P.
Vandergheynst, Stationary signal processing on graphs, IEEE Transactions on Signal Processing 65 (13), 3462–3477, (2017).
[24] M. Rudelson and R. Vershynin, Delocalization of eigenvectors of random matrices with independent entries. Duke Math. J. 164 (2015), no. 13, 2507–2538.
[25] N. Saito, How can we naturally order and organize graph Laplacian eigenvectors?
[26] L. Saul and S. Roweis, Think globally, fit locally: unsupervised learning of low dimensional manifolds. Journal of Machine Learning Research 4, 119–155 (2003).
[27] A. Seary, W. Richards, Spectral methods for analyzing and visualizing networks: an introduction. 2003.
[28] D. Shuman, S. Narang, P. Frossard, A. Ortega, P. Vandergheynst, The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains, IEEE Signal Processing Magazine 30 (3), 83–98.
[29] D. Shuman, B. Ricaud, P. Vandergheynst, A windowed graph Fourier transform, Statistical Signal Processing Workshop (SSP), 2012 IEEE, 133–136, (2012).
[30] D. Shuman, M. Faraji, P. Vandergheynst, A multiscale pyramid transform for graph signals, IEEE Transactions on Signal Processing 64 (8), 2119–2134, (2016).
[31] S. Steinerberger, Directional Poincaré inequalities along mixing flows, Arkiv för Matematik 54, 555–569, 2016.
[32] S. Steinerberger, On the spectral resolution of products of Laplacian eigenfunctions.
[33] T. Tao, Nonlinear Dispersive Equations: Local and Global Analysis. CBMS Regional Conference Series in Mathematics, 106. American Mathematical Society, Providence, RI, 2006.
[34] A. Zygmund, On Fourier coefficients and transforms of functions of two variables. Studia Math. 50 (1974), 189–201.
Department of Mathematics, University of California, San Diego, CA 92093, USA
E-mail address: acloninger@ucsd.edu

Department of Mathematics, Yale University, New Haven, CT 06511, USA
E-mail address: stefan.steinerberger@yale.edu
