Martin J. Gander (University of Geneva)
Domain decomposition methods in general need a coarse correction to be scalable, and it seems natural to use a coarse grid for this purpose, as in multigrid methods. I will show in this talk that while this indeed suffices to make the methods scalable, and thus “optimal” in traditional domain decomposition terminology, there are coarse corrections that lead to much faster two-level domain decomposition methods. To explain this, I will introduce the notion of an optimal coarse space, and optimized approximations thereof. I will finally show that such coarse spaces can do much more than just make the domain decomposition method scalable: they can fix problems of the underlying domain decomposition iteration, such as poor convergence for high-contrast media and divergence of the iterative Additive Schwarz method, and they can even lead to a well-posed Neumann-Neumann and associated FETI domain decomposition method in function space.
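To illustrate the two-level idea the abstract starts from (a one-level Schwarz method plus a multigrid-style coarse-grid correction), here is a minimal sketch, not taken from the talk: a one- versus two-level additive Schwarz preconditioner for a 1D Poisson model problem, with piecewise-linear hat functions on a coarse grid as the coarse space. All problem sizes, subdomain counts, and variable names are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 1D Poisson model problem: -u'' = 1 on (0,1), homogeneous Dirichlet conditions.
n = 255                                            # interior grid points
h = 1.0 / (n + 1)
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr") / h**2
b = np.ones(n)

# Overlapping subdomains: a non-overlapping partition extended by 'ovl' points.
n_sub, ovl = 16, 4
bounds = np.linspace(0, n, n_sub + 1).astype(int)
subdomains = [np.arange(max(0, lo - ovl), min(n, hi + ovl))
              for lo, hi in zip(bounds[:-1], bounds[1:])]
local_lu = [spla.splu(A[idx, :][:, idx].tocsc()) for idx in subdomains]

# Coarse space: piecewise-linear hat functions on a coarse grid (every 16th point),
# i.e. the multigrid-style coarse correction mentioned in the abstract.
coarse = np.arange(15, n, 16)
P = np.zeros((n, coarse.size))                     # prolongation matrix
for k, c in enumerate(coarse):
    left = coarse[k - 1] if k > 0 else -1          # -1 and n stand in for the boundary
    right = coarse[k + 1] if k < coarse.size - 1 else n
    for i in range(left + 1, right):
        P[i, k] = 1.0 - abs(i - c) / ((c - left) if i <= c else (right - c))
A0_lu = spla.splu(sp.csc_matrix(P.T @ (A @ P)))    # coarse operator A0 = P^T A P

def apply_AS(r, two_level):
    """Additive Schwarz preconditioner: sum of local solves (+ optional coarse solve)."""
    z = np.zeros_like(r)
    for idx, lu in zip(subdomains, local_lu):
        z[idx] += lu.solve(r[idx])
    if two_level:
        z += P @ A0_lu.solve(P.T @ r)              # additive coarse correction
    return z

for two_level in (False, True):
    its = []
    M = spla.LinearOperator((n, n), matvec=lambda r, tl=two_level: apply_AS(r, tl))
    x, info = spla.cg(A, b, M=M, callback=lambda xk: its.append(1))
    print(f"{'two-level' if two_level else 'one-level'} AS: {len(its)} CG iterations")
```

With a coarse-grid correction of this kind the preconditioned iteration count becomes essentially independent of the number of subdomains, which is the scalability ("optimality") the abstract refers to; the talk's point is that richer, optimized coarse spaces can go well beyond this baseline.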