How to enjoy your decision

May 22, 2013

In a recent study, some participants had to choose a chocolate from a box holding a selection of 24 chocolates. Others picked from a box containing just six chocolates. Each box had a transparent lid. Some were told to just pick one and taste it. Others had to pick one, but close the lid again before tasting it. Then all participants were asked to rate their chocolate.

Those who put a transparent lid back on the box immediately after choosing from the 24 chocolates enjoyed their candy more than those who lingered with the lid open—even though both groups could see the chocolates not chosen!

What about the six-chocolate box? Closing the lid had no effect on the chocolate ratings. The study appears in the Journal of Consumer Research (PDF).

The researchers say other studies show that when we start with fewer options we don’t tend to ruminate on other choices, or even compare options. We simply like what we get.

An Unheralded Breakthrough: The Rosetta Stone of Mathematics

May 22, 2013

There is no Nobel Prize in mathematics, but in 2001 the Norwegian government established the million-dollar Abel Prize, which is widely considered the mathematicians' equivalent of the Nobel. This year's prize was awarded to Pierre Deligne, professor emeritus at the Institute for Advanced Study in Princeton, N.J. Today he is being honored at a ceremony in Oslo.

Deligne’s most spectacular results are on the interface of two areas of mathematics: number theory and geometry. At first glance, the two subjects appear to be light-years apart. As the name suggests, number theory is the study of numbers, such as the familiar natural numbers (1, 2, 3, and so on) and fractions, or more exotic ones, such as the square root of two. Geometry, on the other hand, studies shapes, such as the sphere or the surface of a donut. But French mathematician André Weil had a penetrating insight that the two subjects are in fact closely related. In 1940, while Weil was imprisoned for refusing to serve in the army during World War II, he sent a letter to his sister Simone Weil, a noted philosopher, in which he articulated his vision of a mathematical Rosetta stone. Weil suggested that sentences written in the language of number theory could be translated into the language of geometry, and vice versa. “Nothing is more fertile than these illicit liaisons,” he wrote to his sister about the unexpected links he uncovered between the two subjects; “nothing gives more pleasure to the connoisseur.” And the key to his groundbreaking idea was something we encounter every day when we look at the clock.

A theorem on intersections of subspaces

May 21, 2013

Theorem

Let $V$ be a vector space over $F$. The intersection of any collection of subspaces of $V$ is a subspace of $V$.
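A sketch of the standard proof: let $\{W_i\}_{i\in I}$ be any collection of subspaces of $V$ and set $W=\bigcap_{i\in I}W_i$. Each $W_i$ contains the zero vector, so $0\in W$ and $W\ne\emptyset$. Given $\alpha,\beta\in W$ and $c\in F$, we have $\alpha,\beta\in W_i$ for every $i$, so the closure of each subspace $W_i$ gives $c\alpha+\beta\in W_i$ for every $i$; hence $c\alpha+\beta\in W$, and $W$ is a subspace of $V$.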

A(dB+C)=d(AB)+AC

May 21, 2013

Lemma
If $A\in M_{m\times n}(F)$ and $B,C\in M_{n\times p}(F)$, then for every scalar $d\in F$ we have the following.
$A(dB+C)=d(AB)+AC$

Proof
$[A(dB+C)]_{ij}=\sum_k A_{ik}(dB+C)_{kj}=\sum_k(dA_{ik}B_{kj}+A_{ik}C_{kj})$
$=d\sum_k A_{ik}B_{kj}+\sum_kA_{ik}C_{kj}=d(AB)_{ij}+(AC)_{ij}$
$=[d(AB)+AC]_{ij}. \square$

Lemma

May 21, 2013

In mathematics, a lemma (plural lemmata or lemmas) from the Greek λῆμμα (lemma, “anything which is received, such as a gift, profit, or a bribe”) is a proven proposition which is used as a stepping stone to a larger result rather than as a statement of interest by itself. There is no formal distinction between a lemma and a theorem, only one of intention.

A good stepping stone leads to many others, so some of the most powerful results in mathematics are known as lemmata, such as Bézout’s lemma, Urysohn’s lemma, Dehn’s lemma, Euclid’s lemma, Farkas’ lemma, Fatou’s lemma, Gauss’s lemma, Greendlinger’s lemma, Jordan’s lemma, Nakayama’s lemma, Poincaré’s lemma, Riesz’s lemma, Schwarz’s lemma, Itō’s lemma and Zorn’s lemma. While these results originally seemed too simple or too technical to warrant independent interest, they have turned out to be central to the theories in which they occur.

source: http://en.wikipedia.org/

Sleep(); delay++; continue; break; for(;;)for(;;)!

May 21, 2013

If you want some delay in your C program, here’s a way to do so:
To help slow down the display, you can insert a delay loop into the program. The purpose of a delay loop is merely to spin the computer’s CPU, burning up clock cycles to slow the program down at a certain point. Here is an example:

Most C environments also provide a ready-made delay function, so you really have no need to program a delay loop, as long as you can stand the wait!
The sleep() function is used to pause a program for a given number of seconds. You specify the number of seconds to wait inside sleep()’s parentheses.

Nested loops happen all the time. Most often, they happen when you’re filling in a grid or an array. Here’s what I programmed as an example:

Two C language keywords can be used to control loops directly: break and continue. What break does is immediately end the loop, jumping to the first statement after it. What continue does is immediately start the loop’s next iteration, skipping over the remaining statements and going back to the for, while, or do that started the loop in the first place. Here comes my programmed example:

That’s what I learned during today’s hour of self-taught C.

Eigenvector & Eigenvalue

May 21, 2013

An eigenvector of a square matrix $A$ is a non-zero vector $v$ that, when multiplied by $A$, yields the original vector multiplied by a single number $\lambda$; that is:
$Av=\lambda v$

The number $\lambda$ is called the eigenvalue of A corresponding to v.

In the standard illustration of a shear mapping, the red arrow changes direction but the blue arrow does not. The blue arrow is an eigenvector of the shear mapping, and since its length is unchanged, its eigenvalue is 1.
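Concretely, taking the standard horizontal shear matrix as an example:
$A=\left[ \begin{array}{cc} 1&1 \\ 0&1 \end{array} \right]$
Then $A\left[ \begin{array}{c} 1 \\ 0 \end{array} \right]=\left[ \begin{array}{c} 1 \\ 0 \end{array} \right]$, so the horizontal vector $(1,0)^T$ is an eigenvector with eigenvalue $\lambda=1$, while $A\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]=\left[ \begin{array}{c} 1 \\ 1 \end{array} \right]$ changes direction and so is not an eigenvector.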

source: http://en.wikipedia.org/

Hermitian matrix

May 21, 2013

In mathematics, a Hermitian matrix (or self-adjoint matrix) is a square matrix with complex entries that is equal to its own conjugate transpose—that is, the element in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th row and i-th column, for all indices i and j:
$a_{ij}=\overline{a_{ji}}$

If the conjugate transpose of a matrix $A$ is denoted by $A^\dagger$, then the Hermitian property can be written concisely as
$A=A^\dagger$.
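For instance, a small hand-picked Hermitian matrix:
$\left[ \begin{array}{cc} 2&{2+i} \\ {2-i}&3 \end{array} \right]$
The off-diagonal entries are complex conjugates of each other, and the diagonal entries are real, since each must equal its own conjugate.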

Hermitian matrices can be understood as the complex extension of real symmetric matrices.

Hermitian matrices are named after Charles Hermite, who demonstrated in 1855 that matrices of this form share with real symmetric matrices the property of always having real eigenvalues.

source: http://en.wikipedia.org/

Symmetric Matrix

May 21, 2013

In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Let A be a symmetric matrix. Then:
$A=A^T$

The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if the entries are written as $A=(a_{ij})$, then
$a_{ij}=a_{ji}$

for all indices i and j. The following 3×3 matrix is symmetric:
$\left[ \begin{array}{ccc} 1&7&3 \\ 7&4&{-5} \\ 3&{-5}&6 \end{array} \right]$

Every diagonal matrix is symmetric, since all off-diagonal entries are zero. In contrast, every diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.
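For contrast, a small skew-symmetric example:
$\left[ \begin{array}{cc} 0&2 \\ {-2}&0 \end{array} \right]$
This matrix equals the negative of its transpose, $A^T=-A$, and its diagonal entries are forced to be zero, since each must equal its own negative.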

source: http://en.wikipedia.org/

Subspace-ness as a Theorem

May 21, 2013

Theorem
Let $\emptyset\ne W\subseteq V$. Then
$W\leq V \Leftrightarrow c\alpha +\beta\in W$ for all $\alpha ,\beta\in W$ and all $c\in F$.
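A sketch of the proof: ($\Rightarrow$) If $W\leq V$, then $W$ is closed under scalar multiplication and addition, so $c\alpha+\beta\in W$. ($\Leftarrow$) Taking $c=-1$ and $\beta=\alpha$ gives $0=-\alpha+\alpha\in W$; taking $\beta=0$ then gives $c\alpha\in W$, so $W$ is closed under scalar multiplication; and taking $c=1$ gives $\alpha+\beta\in W$, so $W$ is closed under addition.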