I just wanted to say thanks to the dozen or so folks who have emailed corrections for typos in the chapters of our book that I’ve posted so far… And, we’ve now posted another chapter. This chapter is on “multiplicative processes”. For me, multiplicative processes are almost always the answer to the question “where do heavy tails come from?” So, this is a pretty important chapter for the book…and for developing intuition about when to expect heavy-tailed distributions to emerge.
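To see why multiplicative processes are such a natural source of heavy tails, here is a minimal simulation (my own illustration, not from the book): repeatedly multiplying by random factors makes the *logarithm* of the outcome a sum of i.i.d. terms, so by the central limit theorem the outcome itself is approximately lognormal, with a pronounced right tail.

```python
# Illustration: a multiplicative process produces heavy-tailed (lognormal)
# outcomes. The specific factor distribution below is an arbitrary choice
# for demonstration purposes.
import random

random.seed(0)

def multiplicative_process(n_steps=100):
    """Start at 1 and repeatedly multiply by a random growth/shrink factor."""
    x = 1.0
    for _ in range(n_steps):
        x *= random.uniform(0.9, 1.2)  # each step scales the current value
    return x

samples = [multiplicative_process() for _ in range(10_000)]

# log(x) is a sum of i.i.d. terms, so it is approximately normal; x itself
# is approximately lognormal. A telltale sign of the right skew: the mean
# is pulled well above the median by the heavy upper tail.
mean = sum(samples) / len(samples)
median = sorted(samples)[len(samples) // 2]
print(f"mean = {mean:.1f}, median = {median:.1f}")  # mean exceeds median
```

Contrast this with an *additive* process (summing the same random terms), which would concentrate around its mean with light, Gaussian-like tails.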
You can download the new chapter here.
I look forward to your comments! None of the chapters have been properly edited yet, so I apologize in advance for the typos you’ll surely find…
As the summer winds down, I’m very happy to report that we actually made measurable progress on our book project this summer!
I’ve been working on a book, “The Fundamentals of Heavy Tails,” for more years than I’d like to admit now, but progress has been very slow — especially since I became department chair. Well, this summer I finally managed to put aside a day a week for book writing and it paid off! We’ve completed drafts of two chapters and have two more that are nearly complete!
Many of the other chapters just require polishing too! So, hopefully, we’ll be able to continue to release chapters during the fall…
In the meantime, I’d love to get comments on the completed chapters, so please take a look and then write to me. You can download chapters here. (We ask that you fill out a form so that we can contact you with updates as we release more chapters and complete the book.)
It seems like I’ve been waiting forever to make this post! Back in the fall, I helped to organize an Alumni College at Caltech centered around the theme of “CS+X”. Now, I’m very excited to announce that the videos from the event are up!
What is an alumni college, you ask? Well, instead of a homecoming game or something like that, we get alumni back to Caltech by promising a day of research talks (really, talks more in the style of TED talks)! So, Alumni College focuses on a different theme each year, and then does a day of provocative talks on that topic. This year the theme was “Disrupting Science and Engineering with Computational Thinking,” i.e., the disruptive power of CS+X.
As I’ve written about before, we view “CS+X” as what makes Caltech’s approach to computer science so distinctive compared to other schools. We pride ourselves on inventing fields and then leaving them for others once they’re popular so that we can invent the next field. So, “seeding fields and then ceding them”…
In any case, the alumni college was a day filled with talks on CS+X from researchers at Caltech who are leading new fields… We covered CS+Astronomy, CS+Physics, CS+Biology, CS+Economics, CS+Chemistry, CS+Energy, and so on…
You can watch all of them on Youtube here. Enjoy!
Well, last week was the final workshop in the semester-long “Algorithms & Uncertainty” program that I helped to organize at the Simons Institute. There are a few weeks of reading groups / seminars / etc. left, but the workshop marks the last “big” event. It’s sad to see the program wrapping up, though I’m definitely happy that I will be spending a lot less time on planes pretty soon — I’ve been flying back and forth every week!
The last workshop was on “Beyond worst-case analysis” and the organizers (Avrim Blum, Nir Ailon, Nina Balcan, Ravi Kumar, Kevin Leyton-Brown, Tim Roughgarden) did a great job of putting together a really unique program. Since there’s no single way to go “beyond worst-case,” the week included a huge array of interesting ways to get results beyond what is possible from worst-case analysis, whether it be learning commonalities of different types of instances, semi-random / smoothed analysis models, or extracting features that enable learning what algorithmic approach to use.
If you’re interested in getting a broad overview of the approaches, I highly recommend watching Avrim’s introductory talk, which gave a survey of a ton of different approaches for going “beyond worst-case” that have received attention recently. Tim also has a great course on the topic that is worth checking out…
My own contribution was the final talk of the week, which gave an overview of our work on how to use properties of prediction noise to help design better online optimization algorithms. The idea is that predictions are crucial to almost all online problems, but we don’t really understand how properties of prediction error should influence algorithm design. Over the last few years we’ve developed some models and results that allow initial insights into, e.g., how correlation structure in prediction errors changes the “optimal” algorithm form. Check out the video for more details!
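To give a feel for why the correlation structure of prediction errors matters, here is a hypothetical sketch (the model and names are mine, not from the talk): with i.i.d. errors, averaging over a window of predictions shrinks the error variance roughly like 1/k, but with strongly correlated (here, AR(1)) errors, averaging helps far less. An algorithm designed under an i.i.d. error assumption can therefore be badly calibrated when errors are correlated.

```python
# Hypothetical sketch: compare the variance of an averaged prediction error
# under i.i.d. errors vs. strongly correlated (AR(1)) errors.
import random
import statistics

random.seed(1)

def iid_errors(k):
    """k independent standard-normal prediction errors."""
    return [random.gauss(0, 1) for _ in range(k)]

def ar1_errors(k, rho=0.95):
    """k AR(1) errors: each is strongly correlated with the previous one.
    The innovation is scaled so the stationary variance stays 1."""
    e = [random.gauss(0, 1)]
    for _ in range(k - 1):
        e.append(rho * e[-1] + (1 - rho**2) ** 0.5 * random.gauss(0, 1))
    return e

def avg_error_variance(gen, k=10, trials=5000):
    """Estimate the variance of the average of k prediction errors."""
    means = [statistics.fmean(gen(k)) for _ in range(trials)]
    return statistics.pvariance(means)

v_iid = avg_error_variance(iid_errors)   # roughly 1/k = 0.1
v_ar1 = avg_error_variance(ar1_errors)   # much larger: correlation persists
print(f"avg-of-10 error variance: iid={v_iid:.3f}, correlated={v_ar1:.3f}")
```

The gap between the two variances is exactly the kind of structural property that the “optimal” algorithm form should adapt to.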
One of the great new NSF programs in recent years is the introduction of the “Algorithms in the Field” program, which is a joint initiative from the CCF, CNS, and IIS divisions in CISE. Its goal is almost a direct match with what I try to do with my research: it “encourages closer collaboration between (i) theoretical computer science researchers [..] and (ii) other computing and information researchers [..] very broadly construed”. The projects it funds are meant to push the boundaries of theoretical tools by applying them in an application domain.
Of course this is perfectly suited to what we do in RSRG at Caltech! We missed the first year of the call due to bad timing, but we submitted this year and I’m happy to say it was funded (over the summer when I wasn’t blogging)!
The project is joint with Steven Low, Venkat Chandrasekaran, and Yisong Yue and has the (somewhat generic) title “Algorithmic Challenges in Smart Grids: Control, Optimization, and Learning.”
For those who are curious, here’s the quick and dirty summary of the goal…taken directly from the proposal.
People outside of Southern California often don’t appreciate how dense (and strong) the collection of universities is in the SoCal region. Between Caltech, USC, UCLA, UCSD, UCSB, Irvine, Riverside, and more, there’s a lot of exciting stuff going on! And, one of the great things about the area is that there’s a strong sense of community. That is really on show at this time of the year…
We’re in the middle of workshop season in the LA area where every week or so there is a Socal X workshop. We’ve already had the Socal Control Workshop, next up is the Socal Network Economics and Game Theory (NEGT) symposium (next Friday). The week after, we have the Socal Theory Day, and the week after that we have the Socal Machine Learning day! (The last two are being hosted at Caltech this year.)
So, if you’re in the area — I’ll probably see you at least once over the next few weeks! Be sure to register for the ones you want to attend ASAP…
We always have a large number of postdocs around at Caltech (we usually have ~20+), and this year is no exception. Our application site just went live, so please help me spread the word. We have (multiple) postdoc openings in all of the following areas in CMS:
I am personally looking for postdocs as part of the first four programs. Don’t worry too much about which program is best suited for you when you apply; the backend of the application site is unified so that all faculty can easily see all the applicants. Just be sure to mark the names of the faculty that you are most interested in working with when you go through the application process.
I’m generally looking for postdocs in the areas of Network Economics, Smart Grid, and Online Algorithms, but a few areas that I’m particularly hoping to find people in are: (i) digital platforms (any flavor of research, from measurement to modeling to economic analysis), (ii) markets surrounding data, (iii) electricity markets for demand response and renewables, (iv) online optimization or, more broadly, online algorithms. So, if you’re interested in these areas please apply (and send me mail)!