
Abstract: Big computer models help us understand and design the complex physical world we inhabit. But the computer models themselves can be overwhelmingly complex, consisting of many submodels from different engineering disciplines. Here's one recipe to simplify things: (1) treat the computer model like an experimental lab, (2) run experiments to generate data, and (3) use the data to fit statistical learning models that suggest simplified physical properties and relationships. Suppose you want to be really sure that the properties suggested by the data are also present in the complex computer model. The trouble is that as you consider more physical parameters affecting those properties, the work, measured in the number of experiments, required to be really sure grows exponentially: the dreaded curse of dimensionality. Maybe you don't need all those parameters to begin with. If you could avoid considering some of them, then the work needed to confirm an apparent insight might become feasible. I'll show you a paradigm for parameter reduction, along with strategies for computing it. Then I'll demonstrate those strategies with applications from computational science and engineering.
 
Toward articulating a long-term research vision, I'll revisit the general 1-2-3 recipe and point out a troubling contradiction in its construction. The contradiction has consequences for today's trend of combining machine learning with computational science models, particularly with respect to interpretability. Articulating this philosophical contradiction ought to suggest better strategies for reconciling and connecting data science and computational science to make better science. That's what I hope to show you.
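
The parameter-reduction paradigm in the abstract is, per the bio, the active subspace idea. As a minimal sketch only (the toy model, parameter dimension, and sample size below are assumptions for illustration, not material from the talk), one common computational strategy in Python is to sample gradients of the model output, average their outer products, and look for a gap in the eigenvalues of that average:

import numpy as np

# Toy stand-in for a "big computer model": f(x) = sin(w @ x) in m = 5
# parameters. The weights, dimension, and sample size are illustrative only.
m = 5
w = np.array([1.0, 0.5, -0.25, 0.1, 0.05])

def grad_f(x):
    # Gradient of f(x) = sin(w @ x) with respect to the parameters x.
    return np.cos(w @ x) * w

# Steps (1)-(2): treat the model as a lab and run experiments by sampling
# the parameter space.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(1000, m))

# Step (3), parameter-reduction flavor: average the outer products of the
# sampled gradients and eigendecompose. A large gap in the eigenvalues
# suggests a low-dimensional "active subspace" that drives the output.
grads = np.array([grad_f(x) for x in X])
C = grads.T @ grads / len(X)
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
print("eigenvalues (descending):", eigvals[::-1])
print("dominant direction:", eigvecs[:, -1])

A few dominant eigenvalues followed by a sharp drop indicate that most of the output variation is explained by a handful of parameter combinations, which is what can bring the confirmation work back within a feasible budget.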

Bio: Paul Constantine is an assistant professor in the Department of Computer Science at the University of Colorado Boulder. He completed his PhD in Computational and Mathematical Engineering at Stanford University and was awarded the Von Neumann Postdoctoral Fellowship at Sandia National Laboratories. Before coming to Boulder, Paul was the Ben L. Fryrear Assistant Professor of Applied Mathematics and Statistics at Colorado School of Mines. Paul's work bridges computational science and data science. His original contributions include active subspaces for parameter reduction.

https://cuboulder.zoom.us/j/190280621
