THE OMNI SHOW

Connect with the amazing community surrounding the Omni Group’s award-winning products.

RSS
131
Feb. 21, 2024, 4 a.m.
Omni Group’s 2024 Roadmap

In this engaging episode of the Omni Show, CEO Ken Case talks about the Omni Group's innovative 2024 roadmap. Ken and Andrew highlight Omni's commitment to embracing new technologies, like Apple Vision Pro and visionOS, and discuss the transformation of productivity apps through 3D interaction and Voice Control.

Show Notes:

The conversation reflects on Omni Group's history of pioneering in the tech space, and its vision for the future of computing as a tool to amplify human capabilities. This episode offers a fascinating glimpse into the future of productivity software, driven by forward-thinking strategies.

Some other people, places, and things mentioned in this episode:

Transcript:

Andrew J. Mason: You are listening to the Omni Show, where we connect with the amazing community surrounding the Omni Group's award-winning products. My name's Andrew J. Mason, and today we're hanging out with CEO Ken Case, discussing the 2024 Omni Group roadmap. Well welcome everybody to this episode of the Omni Show. My name is Andrew J. Mason, and today is a fun one because we get to hang out with CEO Ken Case, who's in the house and level set the year 2024 with us. Ken, thank you for joining us.

Ken Case: Oh, my pleasure. It's always fun to come and share with you what we've been working on, and what we're thinking about working on.

Andrew J. Mason: Oh man, me too. And let's get to it, because dreaming about the future is a superpower of mine. Let's start off with the roadmap blog post. That just hit the airwaves as of this recording, about 24 hours ago. So, fresh. I'm going to dive in really specifically and pick at this mantra that you mentioned deep in the blog post about 2010, the year that our team went "iPad or bust," and this philosophy of venturing into new operating systems as quickly as possible, with a great track record of being ready day-and-date on release. So, ready with Omni software the day the iPad comes out: what is it that drives this bold kind of thinking that says, when new technology shows up, we want to be there waiting on launch day? This forward thinking that says day-and-date, let's go.

Ken Case: Well, I guess, I don't know that bold is the right word; foolish or something might be more appropriate. But it's really born out of love for being on the edge of what technology can do and trying to push that edge forward. I refer to it in the blog post as the bleeding edge, because that's how one of our early clients referred to it when I first tried to invite them to... Oh, we just wrote this web browser, wouldn't you like to build a website? And at this point in time, Netscape Navigator and Internet Explorer didn't even exist yet, right? This is very early on, and they're like, we love that you're so excited about this technology, but we like being on the cutting edge of technology, not the bleeding edge. And so I've come to learn that about myself and about the company we built, that we love being on that bleeding edge. And so yes, we have been there for a lot of day-one launches: of course the iPad or [inaudible 00:02:26] as you mentioned, but more recent introductions of hardware and software, and older introductions of hardware and software, [inaudible 00:02:33] 10, going all the way back to NeXTSTEP 3.0 right at the start of our company. So it's a lot of fun to be pushing technology that way.

Andrew J. Mason: It is, it is. And it's hard not to get excited when there's new hardware involved. Some of it's Apple's design philosophy as well, of pushing the boundaries of what's possible with hardware, and us saying, yeah, we want to be there to support that and play along. Talk to me about Apple Vision Pro. OmniPlan is already making strides over there, available as a native Apple Vision Pro app, and now OmniFocus, you mentioned in the blog post, is available in TestFlight on Apple Vision Pro. What are some of the challenges that show up here? Because we're developing new software for a new platform, but there's also a new dimension here, a new way of interacting with this software and hardware. So there's a lot to consider. Just dip us into that world and tell us some of the things you're thinking about.

Ken Case: Sure. So right from the get-go, the first benefit of the visionOS platform is that it's not really about this specific iteration of hardware, the Apple Vision Pro that launched two weeks ago. It's more about being able to build software that is no longer constrained by screens. And what are the constraints of screens? One constraint is of course the size of the screen. However big the screen is, that's as big as your app can be. And so if you're on a tiny screen, your phone for example, then the app has to be limited to the size of that screen. You can get a bigger screen, you can have a big-screen TV, for example, and put the app on there, but even so you're limited by that screen and where the screen can be; it's not as portable, it's stuck in a particular place, and the app is constrained to that size, and so on. So benefit one, and one of the reasons that we jumped on OmniPlan so quickly, is we can now build an app that extends beyond the bounds of that screen. We can have a Gantt chart be as long as you want it to be, and fill the wall if you want, or fill a few walls with Gantt charts, and have them all be interactive, not just printouts. This is stuff that of course project planners have been printing out for ages, because they do need this big picture of, what does this whole timeline look like? They want to get that whole thing in their head, but then zoom in on the details. And you can do that to some extent on a screen, because you can zoom in and out, and of course we built software that does that, and we've been working with that for years. But it's so much nicer if you can just put it up and have it be as big as you want, and not be constrained by that screen size. But then you sort of alluded to this other portion of this, which is that it's not just about 2D windows and 2D screens; it's about actually moving into that third dimension.
A lot of our software, as it's built today, does not extend that much into the third dimension. And so goal one is to take the capabilities we already have and expose them in this other space. And the easiest way to do that is just to get rid of the window size constraints. But even now, in today's TestFlight of OmniFocus for visionOS, on our perspective bar we have icons for all of your perspectives, and then we have badges on those that say, how many tasks are in your inbox? How many are coming due in Forecast? And so on. In the past, when we were constrained by a 2D screen, we would sort of pretend that that was a 3D badge layered on top of the icon by adding a little shadow effect and so on. But now that we're doing this in 3D for real with visionOS, the platform is letting us actually say, okay, I'm going to move that badge out just a little bit. And it's now actually layered in front of the icon. And when you look at it, your eyes can kind of see around it; as your head moves, you see the parallax. It's actually there, it's not just a pretend shadow. And that's just the tip of the iceberg, obviously, just a tiny little detail in the app. But you can imagine starting to place controls in 3D space. When people work with keyboards, they often work with them in a more horizontal layout instead of always being in a vertical layout. So you can imagine placing some of the controls for the apps down in front of you, like you might place them on a desk or on your lap. It just opens up a whole other dimension to the way you can build and use applications. So for me, what's truly exciting about the Apple Vision Pro and visionOS is not just that it's some neat new hardware. I've been playing with this sort of hardware for decades now. It's that we now have a platform that lets you build true productivity apps that can work together.
You can copy and paste between the apps, you can drag and drop, all the things that we take for granted in the Mac ecosystem. We now have an ecosystem in this 3D environment.

Andrew J. Mason: And for folks that aren't up to speed yet, talk to me about the three different ways these apps can present themselves. So when developing software for Apple Vision Pro and visionOS, there are compatible apps, there are native apps, and then there are re-imagined apps.

Ken Case: One of the interesting things about visionOS is that we have different layers of how much we can really take advantage of it. The apps that we see there on day one, many of them are just compatible apps. Even some of Apple's apps are compatible apps, where they took the iPad app and they stuck it in a little folder called Compatible Apps. And if you open that folder, you can now pull out an iPad app and place it anywhere you want in the world around you. So it's now better than an iPad in that you're not constrained by the position of an iPad, and you can have more of these open, and so on. But still, it's not a super exciting app yet, because it's just an iPad app, and the way you interact with it is you can either reach out and touch it like you would with an iPad, or you look at it and you pinch and interact with it that way. But I think it's more interesting, and of course what Apple's promoting, when apps go native, and that means you're actually taking advantage of the operating system's new APIs, the new visionOS APIs. So there you might have multiple windows open in different places, you might have... Come to think of it, I think you can do that with an iPad app as well, because iPad apps already have multiple spaces. But your controls are no longer limited to the windows. You can have 3D effects, like we just talked about with the OmniFocus perspective badges. And the interactions are really designed to be based around this model of, you look at something and it responds, so you know what is interactive. That changes the way you design an app, when your interaction is primarily through your eyes, your hands, and your voice. Then finally, and we have not really seen very many of these yet, maybe a few are already out there, an app can be not just an app that you would find on another platform, but re-imagined in a way that you couldn't do outside of this sort of 3D operating system.
And so for example, there's an app where somebody has built a drum kit. If you're going to try to play the drums, that's something that doesn't work so well on a flat screen, but it works great if you can actually reach your hands out and do things. We see this with games as well, of course. I've seen a painting app already where they put the painting controls out in front like a drum kit, and you might dip a stylus in and then paint, and so on. So there are opportunities to really reimagine how our apps work, and we've been thinking about those. We haven't spent a bunch of time on that, because our first goal is to get everything at least to that native point. And so right now with OmniPlan, we had a native app that shipped on day one for the Apple Vision Pro. And then with OmniFocus here, two weeks later, we now have OmniFocus in TestFlight as a native app, and we have a ways to go yet to get OmniGraffle and OmniOutliner there as well. But we're looking forward to getting them all there.

Andrew J. Mason: Different folks have different on-ramps to technology and different ways that they look at it, and maybe some areas that they haven't considered. When I zoom out and think about the overall picture of where tech is headed, to me this is such a statement, because one of the first iterations of the promise of all this technology was taking a lot of different things from different places and putting them in one single place. So millions of songs in your pocket with the iPod, or millions of books in one location with reading software. It gives you the ability to take a lot of stuff that usually is available in different spots and put it in one space. Now, with spatial computing, we can see that it adds that layer of, you can put everything everywhere. As you're starting to look at OmniFocus, and we're looking at that TestFlight for visionOS, talk to me about some of the feedback mechanisms. How do you look at moving forward with adding new features in a product like this?

Ken Case: Sure. Well, I should note something about this OmniFocus work that we've done. We did a little bit of work right after WWDC last year, and I think we even talked about how we wanted to quickly make sure that our apps would at least build on visionOS, and that we could use them in the simulator. But then we knew that we needed to focus on actually shipping OmniFocus 4. And so we set it aside, and the team went heads-down on the shipping platforms, and we didn't pick it up again until after 4.0 shipped. And of course 4.0 shipped in December, right before the holidays, and so then people also had their holiday breaks and came back in January ready to start looking at OmniFocus on visionOS. So what we have now is basically six weeks from when we got seriously going on that work. And in that same time since OmniFocus 4.0 shipped in mid-December, we've also shipped 4.0.1, 4.0.2, 4.0.3, 4.0.4, and 4.0.5. So it's not like they've been able to just look at this full time and ignore the shipping apps. We've been busy in a lot of ways. But yeah, as we look ahead to 4.1, we will add visionOS support, and that's its biggest contribution to what's going on with OmniFocus. But then we'll have 4.2 and 4.3 and 4.4 down the road, just like we've done with OmniFocus 3 and its 3.x cycles, where we introduced all sorts of new features over the months and years between the time that 3.0 shipped and when we finally shipped 4.0. So some of the earliest things that we have planned are things like new perspective rules that let you filter things based on dates more effectively. So you can say things like, I want things that are going to be due within the next five days, or maybe things that I completed within the last five days.
Those are the sorts of simple rules that people have asked us about for a while, and over that time they've had to come up with workarounds like, oh, okay, well, I'll set my due-soon period to five days, and then I can ask for anything that's due soon. But then you can't have different perspectives that show you different timeframes. So that's just one example of something we'll be introducing soon that we're pretty excited to be able to do. But there are plenty of other things, and I'm sure people have their favorites; I would love to hear from folks. Now we have this great foundation to build on, with 4.0 having shipped and the code base unified so that when we work on a feature, it works on all of the platforms, now including visionOS. That just makes us so much more productive, and now we're super eager to get back into, okay, what are the features in the app's domain that people want us to work on? Things like Kanban views and so on. Imagine a visionOS Kanban view.
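The date rules Ken describes ("due within the next five days," "completed within the last five days") come down to relative date-window filters. A minimal sketch in plain JavaScript, using hypothetical task objects rather than OmniFocus's actual data model:

```javascript
// Return tasks whose due date falls within the next `days` days.
// `tasks` is an array of { name, due, completed? } where dates are Date objects.
// These field names are illustrative assumptions, not OmniFocus's real schema.
function dueWithin(tasks, days, now = new Date()) {
  const windowEnd = new Date(now.getTime() + days * 24 * 60 * 60 * 1000);
  return tasks.filter(t => t.due && t.due >= now && t.due <= windowEnd);
}

// "Completed within the last `days` days" is the same idea, looking backward.
function completedWithin(tasks, days, now = new Date()) {
  const windowStart = new Date(now.getTime() - days * 24 * 60 * 60 * 1000);
  return tasks.filter(t => t.completed && t.completed >= windowStart && t.completed <= now);
}
```

Because each call takes its own `days` value, two different perspectives can show two different timeframes, which is exactly what the single global due-soon workaround Ken mentions could not do.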

Andrew J. Mason: Oh Ken, I almost wish you hadn't told me about that; now I immediately want that feature. Hey, that's a great sign that you're onto something there, for sure. Let's move over to voice control, because not only can you control things in the visionOS environment with your eye movements and your fingers and gestures, but voice is also a slice of that, and we're interested.

Ken Case: Yeah, absolutely. So I think if you really want to get a lot done on the visionOS platform, you really do want a keyboard, and probably even a trackpad, rather than just relying on looking at something and then pinching with your fingers; that's not nearly as good for text editing as using a keyboard. But there's also this: I talk about the eyes and the hands, but we also have your voice. And voice isn't always appropriate. Sometimes you're in a setting where you don't want to be talking, or maybe your voice is just tired, and I don't know that it's something you would use all the time. But it's been an interesting input mechanism that has been around for quite a long time. I remember playing with it on my NeXT back in the early 90s, setting up voice commands that would do various things. And of course we've seen it on the Mac platform since the 90s as well; it's been a technology that's been there. We added voice input support to our web browser when we brought it to Mac OS X. So you could sit there and have a coffee in one hand and be talking with the other hand, so to speak, and say, okay, follow the link named whatever, because it would scan the links on the page and add voice commands for each of them. So it's something we have played around with for a long time. And Sal, who has been working with us for several years now, has also been working with this for a long time, both when he was at Apple and since he started working with us. So when we built Omni Automation, which lets you extend our apps with your own commands, it was a natural idea for him to say, well, why don't you make it so that we can also extend those commands in ways that let us use voice input to run them, and then use voice output from those commands to talk back to you and give you the...
If you're giving vocal input, it's often nice to hear verbal confirmations come back through your audio senses. And so you might say something, and then it says something back. We added that low-level support to our apps, I guess, about two years ago now. And then Sal just started going to town and built over a hundred voice-activated plugins that you could use to extend our apps in different ways.
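The pattern Ken is describing, a named command that runs an action and speaks a confirmation back, can be sketched in a few lines of JavaScript (Omni Automation's own language). Everything below is a stand-in: the real Omni Automation runtime supplies its own plugin and speech APIs, which are not reproduced here.

```javascript
// Stand-in for a speech-synthesis call (assumption: the actual runtime
// provides its own speech objects; here we just record what was "spoken").
const spoken = [];
function speak(text) {
  spoken.push(text);
  return text;
}

// Registry mapping spoken phrases to commands.
const commands = new Map();

function registerVoiceCommand(phrase, run) {
  commands.set(phrase.toLowerCase(), run);
}

// Dispatch a recognized phrase: run the matching command and voice its reply,
// so vocal input gets a verbal confirmation back.
function handleUtterance(phrase) {
  const run = commands.get(phrase.toLowerCase().trim());
  if (!run) return speak("Sorry, I don't know that command.");
  return speak(run());
}

// Example command: report an inbox count (hypothetical data source).
registerVoiceCommand("how many inbox items", () => {
  const inboxCount = 3; // stand-in for querying the app's database
  return `You have ${inboxCount} items in your inbox.`;
});
```

The point is the round trip: one registration wires a phrase to an action and to the spoken reply, which is the shape Sal's hundred-plus plugins follow at a much richer level.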

Andrew J. Mason: That is so cool, and it's very honoring from an accessibility standpoint. I mean there's so many different forms of input and so many different ways that people want or need to interact with their technology or want or need to get information out of it. And honestly, alternate forms of input and output might be something that is easily missed if you're not used to using it. But once you interact with technology in that way, it's really hard to miss.

Ken Case: Yeah, and it's a great fit for the Apple Vision Pro, for visionOS, because again, you might not have a keyboard with you, and you do want to use your voice to say something or run a command. Now you can just say, "Hey, do this," and it does that.

Andrew J. Mason: Do you ever just pause and look over your shoulder and think, man, 2023 was a jam-packed year? Not that we're busy sitting there thinking about the past or anything, but my gosh, all the different things that happened and showed up, not the least of which: we just found out OmniFocus 4 was an Editor's Choice in the App Store. Congratulations on that, team. Go team.

Ken Case: Thank you. We just stumbled across that yesterday afternoon ourselves when we were looking at our app store page and like, wait a sec, when did that get added?

Andrew J. Mason: It's got to be hard when you're moving at such a fast pace to look back and level set. That's why I love these roadmap posts because it gives that delineation, that mile marker to say, where have we gone and what's happening for us for the future? What do you feel when you look toward 2024? How do you go about thinking about what's to come in the year ahead?

Ken Case: So I would say that most of the time, our eyes are not looking back. They're mostly looking ahead to what comes next: what do we need to do next? In fact, I feel kind of bad sometimes, because we often jump so quickly from what we just accomplished to the next thing that we don't take a moment to pause and celebrate what we just accomplished. And so in some ways the roadmap is an opportunity to do that, to look back at the year gone by and think, oh yeah, what is it that we've done? And to share that with the team, as well as, of course, with all of you. But yeah, for the most part I think our eyes are just on the future. Okay, well, what next? We know the technology isn't necessarily where we want it all to be yet, or that our apps are not where we want them to be yet, and what can we do to help bring them to where we want them to be, whatever those actions are. And so we just have to take it a step at a time. Sometimes we have a long game in mind: in order to do that, we need to do this first, before we can do the rest. And that includes things like making our apps universal, making the code bases universal, with shared cross-platform code in SwiftUI. SwiftUI is not a user-facing feature. We don't expect end users to care whether we're written in SwiftUI or C++ or JavaScript, or whatever. But it is an important feature for our development team to know that when I write this code on the Mac, I can also use it on visionOS, or on my watch, or on my phone or my iPad; they all share this same language, and I'm not having to rewrite it again and again every time we add a new feature. And so that ends up being a real force multiplier for our engineering efforts, because now as we add new features, we can say, okay, and look, it's available wherever it makes sense.
I mean maybe a Kanban view is not going to work very well on the watch because the watch just doesn't have space for it, but if it makes sense, we could put it everywhere else.

Andrew J. Mason: I know it's not in our list of questions, so feel free to bypass if you want to, but take a guess at this one. When I think about the fact that there's thousands of people who run their productivity on software that we've helped create, that's a very rewarding, very satisfying thought, but I also know it's not why we do it to begin with. Because this is something that you've wanted to see exist in the world and you're like, well, I'm going to do it. So on some level, it's a slice of your life's work but, what is it in you that drives you in that direction? Do you have any thoughts on... What can you point to say, this is why I do this?

Ken Case: I don't know either. I do know that the passion is there. I don't necessarily know that I'm the best person to do it, either. There are a lot of people who have been working on these problems for decades, and there are a lot of people who have just started to work on them for the first time. And it's always exciting to see new developers come up with new apps, and what people are thinking of and what they're building. That's one of the other exciting things about checking out visionOS right now: just seeing the new apps appearing in the App Store every day, as people come up with new ideas for how they're going to meet needs that people have in the real world. Obviously, right now the market for the Apple Vision Pro is limited, because it's so expensive, because it's only available in the US, and so on. We're not trying to build for the market that exists today. We're trying to build for a platform that we imagine seeing down the road in a decade. If we were to ask somebody 50 years ago what computing was going to be like and how you would interact with computers, maybe they would imagine screens in your pocket, but maybe they would imagine projections around the world instead. Why would you be limited to these little flat things that you carry around and put in different places, instead of having an interface that is no longer flat and can go anywhere you want? And so it may take us a while to get there for real with the hardware. The current hardware is the first, I think, that achieves it in a practical way, where you can actually start to use it and prototype and build these apps that I'm talking about building. But this is not where it's going to end. This is the first iteration of... too many people have said the iPhone, so I don't want to say the iPhone, because that was so enormously successful. Can anyone predict that this is going to be the same? Not necessarily. But even the Apple Watch, every...
The Apple II, the Macintosh, all of these products, they all take time, and they require people pushing on it from every angle. And we want to push on it from the productivity angle and make sure that that part of the story doesn't get left out.

Andrew J. Mason: Yeah, and not to belabor the point, but it really does make me think of that Steve Jobs, Walt Disney, keep-moving-forward, just-keep-swimming mentality that says, it's ahead. That's where progress is: it's ahead. And any time new hardware shows up, we want to do our best to steward it by pushing the boundaries of that hardware with the software we create. And the productivity factor as well. I mean, it's really gratifying to know that we're creating tools that help people be as productive as possible, and we want to do our best to continue to do that. And thank you, Ken, and everybody, honestly. For me, this really is part of that backpack process to say, "Okay, yay team, job well done. It's awesome, we'll keep moving forward." But just to have the waypoint and the marker to say, this is incredible what's gone on in the year past, and here's where we're looking ahead, it's really cool to be able to do this, Ken. Thank you.

Ken Case: Well, thank you. I'm grateful that you've joined us on this. It's a lot of fun, and to any customers who are listening, I'm grateful for all of you who have helped make this possible. I've probably talked before about Steve Jobs' analogy of the bicycle for the mind, but I enjoy it so much, I think I'm just going to talk about it again, if that's all right. Back in the 90s, Steve was asked a little bit about his passion for computing and why he cared about productivity apps. And he talked about the computer being a bicycle for the mind. Where that analogy came from was that he was talking about an episode, I don't remember, on National Geographic or something, where they were looking at the most efficient animals in terms of how much energy they spent to travel a mile of distance, for example. And humankind was just kind of average in that mix, not anything to write home about. There were birds that could coast and certainly do that much more efficiently than we can, and so on. But if you take a human and pair them with a bicycle, that bicycle can take them there with much less energy than any of the animals on the chart. Suddenly, humankind is much, much better on that graph. And so he said, for him, that was what a computer was. It was a bicycle for the mind, something that could let us apply a little bit of energy to the problems we were working on in our mental space and get a lot more result. And so we love being a team that builds tools that are bicycles for the mind.

Andrew J. Mason: Ken, I'm going to leave it right there. Thank you so much.

Ken Case: All right, thank you.

Andrew J. Mason: Hey, and thank all of you for listening today, too. You can find us on Mastodon at @theomnishow@omnigroup.com. You can also find out everything that's happening with the Omni Group at omnigroup.com/blog.