Sebastian Pokutta's Blog

Mathematics and related topics

Archive for the ‘Crackpot hallucinations’ Category

The Scottish Café revisited

leave a comment »

This is an attempt to bring together mathematicians and like-minded people in a virtual spot to discuss mathematics and research.

The Scottish Café (Polish: Kawiarnia Szkocka) was the café in Lwów (now Lviv) where, in the 1930s and 1940s, mathematicians from the Lwów School collaboratively discussed research problems, particularly in functional analysis and topology.

Stanislaw Ulam recounts that the tables of the café had marble tops, so they could write in pencil, directly on the table, during their discussions. To keep the results from being lost, and after becoming annoyed with their writing directly on the table tops, Stefan Banach’s wife provided the mathematicians with a large notebook, which was used for writing the problems and answers and eventually became known as the Scottish Book. The book—a collection of solved, unsolved, and even probably unsolvable problems—could be borrowed by any of the guests of the café. Solving any of the problems was rewarded with prizes, with the most difficult and challenging problems having expensive prizes (during the Great Depression and on the eve of World War II), such as a bottle of fine brandy.[1]

For problem 153, which was later recognized as being closely related to Stefan Banach’s “basis problem”, Stanislaw Mazur offered the prize of a live goose. This problem was solved only in 1972 by Per Enflo, who was presented with the live goose in a ceremony that was broadcast throughout Poland.[2]

I have been thinking about “the Scottish Café” for a while, and I was wondering whether it is possible to have “something” like this in a virtual, decentralized fashion. Last night I accidentally participated in a Google Hangout and I thought that this might be a good platform for it.

At this point I do not have a clear idea of what a good setup, rules, etc. would be. But you are most welcome to join and provide your ideas and feedback.

A proposed initial setup could be either a classic seminar talk plus discussion or a free discussion about a pre-specified topic. The first might be easier in terms of getting things started, whereas the second is more in the original spirit of the café. While I personally would prefer the second in the mid-term, it might be better to get things started with the first.

The technological platform will be Google+ with its post/hangout features.
The Scottish Café will be maintained by Lars Schewe and myself for the moment. However, if you are interested in helping us get things started, please drop us a line.

Here are the links:


Written by Sebastian

February 3, 2012 at 12:45 pm

Fundamental principles(?) in mathematics

with 2 comments

As mentioned in one of my previous posts, David Goldberg and I had a nice discussion about “fundamental concepts” in mathematics. Our definition of “fundamental” was that,

  1. once seen, you cannot imagine not having known it before, and it completely changes your way of thinking, and
  2. a somewhat realistic criterion applies, i.e., when it is subtracted from the “world of thinking”, something is missing.

So here is our preliminary list of things that we came up with – in random order – and a very brief (totally biased) meta-description of what we mean by these terms. Of course, this list is highly subjective! For each of these “fundamental concepts” the idea is to have (about) three applications and try to distill the main core. There will (probably) be a separate blog post for each point on the list.

  1. Identity/Equality. Closely related to being isomorphic. The power of identity is so penetrating that I cannot even find a short explanation. Do I have to? (Aristotle’s first law of thought)
  2. Contradiction. Showing that something cannot be true as it leads to a contradiction or inconsistency. Closely related to this is the principium tertii exclusi or law of the excluded third, as this is how we often use proofs by contradiction: a statement holds under various assumptions because its negation leads to a contradiction. If you do not believe in the law of the excluded third, then you obtain a different logic/mathematics. In particular, in these logics, usually every proof also constitutes some form of an algorithm, as existence by mere contradiction when assuming non-existence is not allowed. (see also Aristotle’s second/third law of thought)
  3. Induction. Establishing a property by relying on the same property for smaller sub-objects.
  4. Recursion. Somewhat dual to induction: a larger object is defined as a function of smaller objects that have been subject to the same construction themselves.
  5. Fixpoint. The existence of a point that is invariant under a map. Think of equilibria in games.
  6. Symmetry. The notion of symmetry. Take a cube – rotating it does not really change the cube.
  7. Invariants. Think of the dimension of a vector space. Invariants are a powerful way to show that two things are not equal (or isomorphic).
  8. Limits. What would we do without limits? The idea of hypothetically continuing a process infinitely long. Think of the definition of a derivative.
  9. Diagonalization. One of my personal favorites. Constructing an object that is not in a given family by making sure it differs from every member of the family in (at least) one position. Diagonalization often exploits self-reference. An example is Cantor’s proof that the reals are uncountable (see the sketch after this list).
  10. Double counting. You count a family of objects in two different ways; then the resulting “amounts” have to be identical. A typical example is the handshake lemma in graphs (see the small example after this list).
  11. Proof. The notion of proof is very fundamental. Once proven, a statement remains true (provided consistency etc.). Interestingly, it can be proven that some things cannot be proven. A good example of the latter is the existence of inaccessible cardinals, which cannot be proven in ZFC (provided ZFC is consistent).
  12. Randomness. Randomness is an extremely fundamental concept. One of my favorite applications is probably the probabilistic method. Think about Johnson’s {7/8}-approximation algorithm for 3SAT or the PCP theorem.
  13. Algorithm. When considering a function {f: M \rightarrow N} we are often not just interested in what {f} computes but in particular how it can be computed. In this sense the algorithmic paradigm is an additional layer on top of the somewhat descriptive layer of classical mathematics.
  14. Exponential growth. What we were particularly thinking about was the idea that a relative improvement bounded away from {0} ensures exponential progress: if a quantity shrinks by a factor of {1-\delta} with {\delta > 0} in every step, then after {k} steps it is at most {(1-\delta)^k \leq e^{-\delta k}} times its initial value. This is used regularly in different scaling algorithms such as barrier algorithms, potential reduction methods, and certain flow algorithms.
  15. Information. The idea that often a critical amount of information is necessary to decide a property. Then fooling-set-like arguments can show that the available information is not sufficient. Prime examples include the classical result that sorting via comparisons needs at least {\Omega(n \log n)} comparisons, communication complexity, and query complexity.
  16. Function/Relation. Mapping one set to another. In particular important when the function/relation preserves structure, i.e., when it is a homomorphism.
  17. Density and approximation. The idea that a set (such as the reals) can be approximated arbitrarily well by an exponentially smaller set (such as the rationals). This exponential by polynomial approximation is also something that we are using in approximation algorithms, say, when we round the input. In this case the set of polytime solvable (rounded) instances is “dense” in the set of all instances. It can also be found in set theory when using prediction principles (such as Jensen’s diamond principle or Shelah’s Black Box) to predict functions on a stationary set by an exponentially smaller set.
  18. Implicit definitions. The concept of defining something not in an explicit manner but as a solution to a set of constraints.
  19. Abstraction. The use of variables is so ingrained in us that we cannot even imagine doing serious mathematics without them. But abstraction is much more: it is the ability to see more clearly because we “abstract away” unnecessary details, and we use “abstraction” to unify seemingly unrelated things.
  20. Existence (in the sense that Brouwer hated). One of the keywords here is probably non-constructivism, and the probabilistic method and indirect arguments are two prominent methods in this category. This was something that Brouwer despised: the idea to infer, e.g., the existence of something merely because the contrary statement would lead to a contradiction (Brouwer’s school of thought denies the tertium non datur). The probabilistic method might have been fine with him, although that is not clear at all, as on a deep level we are merely trading an existential quantifier for a random one… long story…
  21. Duality. By duality we mean the wider idea of duality, i.e., for example the universal quantifier and the existential quantifier. Basically, when we talk about duality we often think about some structure describing the “space of positive statements” and a dual structure that describes the “space of negative statements”. In some sense duality is a form of compact representation of the negation of a statement.
  22. Counting. Counting is again something that penetrates every mathematical theory. My favorite application of counting is the Pigeonhole principle.
  23. Hume’s principle (suggested by Hanno – see comments). Two quantities are the same if there exists a bijection between them. Somewhat related to “equality”; however, here we explicitly ask for the existence of a bijection. For example, there are as many integers as there are rationals.
  24. Infinity (suggested by Hanno – see comments). The idea that something is not finite. With the notion of infinity I feel that the notions of countably infinite and uncountably infinite are closely connected. In fact, the Continuum Hypothesis (CH) is such a case. It is consistent with ZFC and asserts that the first uncountable cardinal is the size of the power set of the natural numbers (essentially the reals), i.e., that \aleph_1 = 2^{\aleph_0}. However, in other models of set theory \aleph_1 \neq 2^{\aleph_0} holds, obtained, e.g., by adding Cohen reals.
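To make the diagonalization paradigm (point 9) a bit more concrete, here is a minimal sketch of Cantor’s argument that the set of 0/1-sequences (and hence the set of reals) is uncountable. Suppose {f: \mathbb{N} \rightarrow \{0,1\}^{\mathbb{N}}} were an enumeration of all 0/1-sequences. Define a sequence {d} by {d(n) = 1 - f(n)(n)}, i.e., flip the {n}-th entry of the {n}-th sequence. Then {d} differs from {f(n)} at position {n} for every {n}, so {d} is not in the image of {f}, contradicting the assumption that {f} enumerates all sequences.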
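Similarly, for the double counting paradigm (point 10), here is a minimal Python sketch of the handshake lemma; the example graph is, of course, made up. Every edge {uv} contributes once to the degree of {u} and once to the degree of {v}, so summing all degrees counts every edge exactly twice:

    # Double counting: the sum of all vertex degrees and twice the number
    # of edges both count the set of (vertex, incident edge) pairs.
    edges = {(0, 1), (0, 2), (1, 2), (2, 3)}  # a small example graph

    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1

    assert sum(degree.values()) == 2 * len(edges)  # handshake lemma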

Written by Sebastian

January 2, 2012 at 8:49 pm

Let us have a securitization party

with one comment

The concept of securitization is very versatile. From Wikipedia:

Securitization is a structured finance process that distributes risk by aggregating debt instruments in a pool, then issues new securities backed by the pool. The term “Securitisation” is derived from the fact that the form of financial instruments used to obtain funds from the investors are securities. As a portfolio risk backed by amortizing cash flows – and unlike general corporate debt – the credit quality of securitized debt is non-stationary due to changes in volatility that are time- and structure-dependent. If the transaction is properly structured and the pool performs as expected, the credit risk of all tranches of structured debt improves; if improperly structured, the affected tranches will experience dramatic credit deterioration and loss. All assets can be securitized so long as they are associated with cash flow. Hence, the securities which are the outcome of Securitisation processes are termed asset-backed securities (ABS). From this perspective, Securitisation could also be defined as a financial process leading to an issue of an ABS.

The cash flows of the initial assets are paid out according to the seniority of the tranches in a waterfall-like structure: first the claims of the most senior tranche are satisfied, and if there are remaining cash flows, the claims of the following tranche are satisfied. This continues as long as there are cash flows left to cover claims:

Individual securities are often split into tranches, or categorized into varying degrees of subordination. Each tranche has a different level of credit protection or risk exposure than another: there is generally a senior (“A”) class of securities and one or more junior subordinated (“B,” “C,” etc.) classes that function as protective layers for the “A” class. The senior classes have first claim on the cash that the SPV receives, and the more junior classes only start receiving repayment after the more senior classes have repaid. Because of the cascading effect between classes, this arrangement is often referred to as a cash flow waterfall. In the event that the underlying asset pool becomes insufficient to make payments on the securities (e.g. when loans default within a portfolio of loan claims), the loss is absorbed first by the subordinated tranches, and the upper-level tranches remain unaffected until the losses exceed the entire amount of the subordinated tranches. The senior securities are typically AAA rated, signifying a lower risk, while the lower-credit quality subordinated classes receive a lower credit rating, signifying a higher risk.
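To make the waterfall mechanics concrete, here is a minimal Python sketch; the tranche names and claim sizes below are made-up numbers for illustration, not taken from any real structure:

    def waterfall(cash, claims):
        """Distribute cash to tranches ordered from most senior to most junior."""
        payouts = []
        for name, claim in claims:
            paid = min(cash, claim)  # a tranche is paid only from what is left
            payouts.append((name, paid))
            cash -= paid             # the remainder flows down to the next tranche
        return payouts

    # 80 units of cash against 100 units of total claims:
    print(waterfall(80, [("A (senior)", 60), ("B (mezzanine)", 25), ("C (equity)", 15)]))
    # -> [('A (senior)', 60), ('B (mezzanine)', 20), ('C (equity)', 0)]

The 20-unit shortfall is borne entirely by the junior tranches, which is exactly the subordination effect described in the quote above.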

In more mathematical terms, securitization basically works as follows: take your favorite set of random variables (for the sake of simplicity, say, binary ones) and consider the joint distribution of these variables (pooling). In a next step, determine percentiles of the joint distribution (of default, i.e., 0) that you sell off separately (tranching). The magic happens via the law of large numbers and the central limit theorem (and variants thereof): although each variable can have a high probability of default, the probability that more than, say, x% of them default at the same time decreases (almost) exponentially in the pool size. Thus the resulting x-percentile can have a low probability of default already for small x. That is the magic behind securitization, which is called credit enhancement.
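Here is a small Monte Carlo sketch of this pooling/tranching effect in Python; the pool size, the individual default probability, and the attachment point are illustrative assumptions, not calibrated values, and the loans are assumed to be independent:

    import random

    n, p, trials = 100, 0.1, 100_000  # 100 loans, each defaulting with prob. 10%
    attachment = 0.2                  # senior tranche only takes losses above 20%

    hits = 0
    for _ in range(trials):
        defaults = sum(random.random() < p for _ in range(n))
        if defaults > attachment * n:  # pooled losses reach the senior tranche
            hits += 1

    print(f"P(single loan defaults)      = {p:.4f}")
    print(f"P(senior tranche is touched) = {hits / trials:.4f}")

With independent loans the pooled tail probability is tiny: a Chernoff bound caps it at roughly {e^{-n \cdot D(0.2 \| 0.1)} \approx 0.012} here, even though each individual loan still defaults with probability 10%. Note that independence does all the work; correlated defaults fatten the pooled tail considerably, which is precisely the correlation risk mentioned in (a) below.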

So, given that this process of risk mitigation and tailoring of risks to the risk appetite of potential investors is rather versatile, why not apply the same concept to other cash flows that bear a certain risk of default and turn them into structured products 😉

(a) Rents: Landlords face the problem that a tenant’s credit quality is basically unknown. Often, a statement about the tenant’s income and liabilities is supposed to help estimate the risk of default, but this procedure can, at best, serve as an indicator. So why not use the same process to securitize the rent cash flows and sell the corresponding tranches back to the landlords? This would have several upsides. First of all, the landlord obtains a significantly more stable cash flow and, depending on his/her risk appetite, could even invest in the more subordinated tranches. This could potentially reduce rents, as the risk premium charged by the landlord due to his/her potentially risk-averse preferences could be reduced to the risk-neutral amount (plus some spreads, e.g., operational and structuring costs). The probability of default could be estimated significantly more easily for the pooled rent cash flows, as due to diversification it is well approximated by the expected value (maybe categorized into subclasses according to credit ratings). Of course, one would have to deal with problems such as adverse selection and the potentially hard task of estimating the correlation – which can have a severe impact on the value of the tranches (see my post here).

(b) Sports bets: Often these bets, as random variables, have a high probability of default (e.g., roughly 50% for a balanced win/loss bet). In order to reduce the risk through diversification, a rather large amount of cash has to be invested to obtain a reasonable risk profile. Again, securitizing those cash flows could create securities with more tailored risk profiles that could be of interest to rather risk-averse people on the one hand and risk-seeking gamblers on the other.

(c) …

That is the wonderful world of structured finance 😉

Written by Sebastian

December 30, 2009 at 2:35 pm