Notes on Bloom Filters

Here are some notes after listening to a Coursera lecture from Algorithms 1 by Professor Roughgarden at Stanford. The goal of a Bloom filter is to provide a space-efficient hash structure for a set of data entries. Inserting an entry and checking membership are both very efficient. The trade-off is that deletion is not possible and…
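The insert/lookup mechanics can be sketched in a few lines of Python. This is a minimal illustration, not the lecture's implementation; the bit-array size `m`, the number of hash functions `k`, and the salted-SHA-256 hashing scheme are all assumptions made for the example.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch (illustrative parameters)."""

    def __init__(self, m=1024, k=3):
        self.m = m                 # number of bits in the filter
        self.k = k                 # number of hash functions
        self.bits = [False] * m

    def _indexes(self, item):
        # Derive k bit positions by salting one hash function with i.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        # Insertion: set all k bits for this item.
        for idx in self._indexes(item):
            self.bits[idx] = True

    def might_contain(self, item):
        # Lookup: all k bits set => "probably present".
        # False positives are possible; false negatives are not.
        return all(self.bits[idx] for idx in self._indexes(item))
```

Note that `add` only ever sets bits, which is exactly why deletion is impossible: clearing a bit could remove evidence for some other entry that hashed to the same position.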

Some Results for Hermitian Matrix

Lemma 1: All eigenvalues of a Hermitian matrix are real. Proof: Let $latex A$ be Hermitian, and let $latex \lambda$ and $latex x$ be an eigenvalue of $latex A$ and a corresponding eigenvector. We have $latex \lambda^* x^H x = (x^H A^H) x = x^H (A x)=\lambda x^Hx$. Thus we have $latex \lambda^*=\lambda$, as $latex x^H…
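As a quick numeric sanity check of Lemma 1, here is a pure-Python sketch for the $latex 2\times 2$ case. The matrix entries are arbitrary example values; the only requirement is the Hermitian structure (real diagonal, off-diagonal entries conjugate to each other).

```python
import cmath

# Hermitian example: A[1][0] is the conjugate of A[0][1]; diagonal is real.
A = [[2.0, 1 + 3j],
     [1 - 3j, -1.0]]

# Characteristic polynomial of a 2x2 matrix: l^2 - tr(A) l + det(A) = 0.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = cmath.sqrt(tr * tr - 4 * det)
eigs = [(tr + disc) / 2, (tr - disc) / 2]

for lam in eigs:
    print(lam)  # imaginary parts come out (numerically) zero
```

Here $latex \mathrm{tr}(A)=1$ and $latex \det(A)=-12$, so the eigenvalues are $latex 4$ and $latex -3$: both real, as the lemma promises.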

Schur Complement and Positive Definite Matrix

For a matrix $latex M = \begin{pmatrix} A &B\\C&D\end{pmatrix}$, we call $latex S\triangleq D -CA^{-1}B$ the Schur complement of $latex A$ in $latex M$. Note that $latex S$ arises naturally in block matrix inversion. When $latex M$ is symmetric and $latex A$ is positive definite, $latex M$ is positive definite if and only…

Block Matrix Inversion

Here are some formulas for matrix inversion. Lemma 1: For a block matrix $latex M=\begin{pmatrix}A & B \\C &D\end{pmatrix}$, $latex M^{-1}=\begin{pmatrix}(A-B D^{-1} C)^{-1}& -A^{-1}B(D-CA^{-1}B)^{-1}\\-(D-CA^{-1}B)^{-1}CA^{-1}&(D-CA^{-1}B)^{-1}\end{pmatrix}$ $latex =\begin{pmatrix}A^{-1}+A^{-1}BS^{-1}CA^{-1}& -A^{-1}BS^{-1}\\-S^{-1}CA^{-1}&S^{-1}\end{pmatrix}$, where $latex S=D-CA^{-1}B$ is the Schur complement of the block $latex A$. Proof: Let $latex M^{-1}=\begin{pmatrix}E&F\\G&H\end{pmatrix}$; then $latex M M^{-1}=I$ gives us $latex AE+BG=I$, $latex AF+BH=0$, $latex CE+DG=0$, $latex…
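The lemma can be sanity-checked numerically in the simplest setting, $latex 1\times 1$ blocks (so $latex a,b,c,d$ are scalars; the values below are arbitrary examples), by comparing the blockwise formula against the direct $latex 2\times 2$ inverse.

```python
a, b, c, d = 3.0, 1.0, 2.0, 5.0
s = d - c * b / a                      # Schur complement of a

# Blockwise inverse from the lemma (second form):
E = 1/a + (1/a) * b * (1/s) * c * (1/a)
F = -(1/a) * b * (1/s)
G = -(1/s) * c * (1/a)
H = 1/s

# Direct inverse of [[a, b], [c, d]] via the adjugate formula:
det = a * d - b * c
direct = [[d/det, -b/det], [-c/det, a/det]]

print([[E, F], [G, H]])
print(direct)  # the two agree entry by entry
```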

Existence of Eigenvectors for Linear Operators on a Complex Vector Space

Here are some quick notes on the existence of an eigenvector for a linear operator $latex T$ on a complex vector space of dimension $latex n$. Consider any non-zero vector $latex X$ in the space; the $latex n+1$ vectors $latex X, T[X],T^2[X],T^3[X],\cdots,T^n[X]$ must be linearly dependent. Thus, there exist $latex a_0,a_1,\cdots,a_n$, not all zero, such that $latex a_0X+a_1T[X]+a_2T^2[X]+\cdots+a_nT^n[X]=0$. If…
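The standard way to finish from the linear-dependence relation is to factor the associated polynomial over $latex \mathbb{C}$; a sketch of that step (the roots $latex \lambda_j$ are whatever the fundamental theorem of algebra provides, with $latex m$ the degree of the polynomial):

```latex
% Factor the polynomial a_0 + a_1 z + \cdots + a_n z^n over \mathbb{C}
% (fundamental theorem of algebra), then apply it to T:
0 = a_0 X + a_1 T[X] + \cdots + a_n T^n[X]
  = c\,(T-\lambda_1 I)(T-\lambda_2 I)\cdots(T-\lambda_m I)[X].
% Since X \neq 0 and c \neq 0, at least one factor (T-\lambda_j I) is not
% injective; any non-zero vector in its kernel is an eigenvector of T
% with eigenvalue \lambda_j.
```

This is where the complex field is essential: over $latex \mathbb{R}$ the polynomial need not split into linear factors, and indeed a rotation of the plane has no real eigenvector.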

Over-justification Effect

I am watching an online course on gamification. It discussed extrinsic and intrinsic motivation and described interesting examples of the over-justification effect, in which subjects can be demotivated by rewards. The first one is about drawing by kids. Many kids are very interested in drawing and are intrinsically motivated to draw. However, when they are given…

TF-IDF

Here are some notes on TF-IDF as a measure for identifying keywords in a document. To index documents efficiently for future search, it is important to identify the keywords in each document. For a word $latex w$ to be a good keyword, it should first be relatively rare. This can be measured by the inverse document frequency…
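The rarity idea can be made concrete with a small sketch: $latex \mathrm{tf}(w,d)$ counts how often $latex w$ appears in $latex d$ (normalized by document length here, one common convention), and $latex \mathrm{idf}(w)=\log(N/\mathrm{df}(w))$ penalizes words that appear in many of the $latex N$ documents. The toy corpus is an assumption for illustration.

```python
import math

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs",
]
tokenized = [d.split() for d in docs]
N = len(tokenized)

def tf(word, doc):
    # Term frequency, normalized by document length.
    return doc.count(word) / len(doc)

def idf(word):
    # Inverse document frequency: rare-across-corpus words score higher.
    df = sum(1 for doc in tokenized if word in doc)
    return math.log(N / df)

def tf_idf(word, doc):
    return tf(word, doc) * idf(word)

# "the" occurs often but in most documents, so its idf drags it down;
# "cat" occurs once but only in one document, so it scores higher.
print(tf_idf("the", tokenized[0]), tf_idf("cat", tokenized[0]))
```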

Locality-Sensitive Hashing (LSH)

I am studying an online class on Web Intelligence and Big Data. Dr. Shroff described a very nice example of LSH; below are some quick notes. The idea of LSH is essentially the same as dimension reduction. We have some data vector $latex x$ that can be very high-dimensional. By processing hash data…
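One classic LSH family (a random-hyperplane / SimHash-style scheme; this particular construction, the dimensions, and the seed are assumptions, not necessarily the example from the lecture) reduces a high-dimensional vector to a short bit signature: one bit per random hyperplane, recording which side of the plane the vector falls on. Vectors at a small angle tend to share most signature bits.

```python
import random

random.seed(0)
dim, n_bits = 8, 6
# Each hyperplane is a random Gaussian normal vector.
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

def signature(x):
    # One bit per hyperplane: 1 if x is on the non-negative side.
    return tuple(int(sum(p_i * x_i for p_i, x_i in zip(p, x)) >= 0)
                 for p in planes)

x = [1.0, 2.0, 0.5, -1.0, 0.0, 3.0, 1.5, -0.5]
y = [1.1, 1.9, 0.6, -1.1, 0.1, 2.9, 1.4, -0.4]   # small perturbation of x
z = [-v for v in x]                                # opposite direction

# Nearby vectors usually agree on most bits (not guaranteed bit-for-bit);
# the antipodal vector flips every bit.
print(sum(a != b for a, b in zip(signature(x), signature(y))))
print(signature(x), signature(z))
```

The signature plays the role of the reduced representation: candidates for a nearest-neighbor query are only those items whose signature matches (or nearly matches) the query's, which is far cheaper than comparing full vectors.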

Computer Graphics Course

I stumbled upon a very interesting course from Berkeley today. The title is Mesh Generation and Geometry Processing in Graphics, Engineering, and Modeling, by Jonathan Shewchuk.