Welcome to episode 351 of the Microsoft Cloud IT Pro podcast, recorded live on August 30th, 2023. This is a show about Microsoft 365 and Azure from the perspective of IT pros and end users, where we discuss recent news and how it relates to you. It's hurricane time or Python time, or maybe the hurricane is bringing pythons. This week, Scott and Ben catch up on some of the latest news around Power Automate, Microsoft Entra, and Microsoft Excel while they look out their windows.
Watching Hurricane Idalia whipping the tree branches around. It was a dark and stormy morning. Yeah, we're back to hurricane season. Yeah. Which means we get to sit here and record and watch the trees gently swaying in the breeze. Yeah. I keep turning around like this. It's 'cause it's gusting. We're like, whoosh. You're looking, whoosh. See, I have mine right out here, coming through the backyard, so we'll see how it goes. I don't have to completely look that way.
I just look out my window here right now. It's really still. But yes, I'm the same way watching the gusts. Every once in a while the trees start madly dancing in the wind. Oh, look at you. You've been playing with ChatGPT and poetry, haven't you? Yeah, no, I should. I should have ChatGPT write a poetry script for the podcast. Yeah, I dunno. I don't know about you, but we haven't gotten much rain.
It's supposed to be windy later today as it gets a little closer to us, we're supposed to get 60 mile an hour winds, but really no rain, which I'm disappointed in because I'm cheap and I don't wanna pay for water to fill up the pool and we're not getting enough rain to fill the pool back up. Yeah, well. It's a hard life, but. Yeah. Yeah. We'll see how this one's going.
Folks on the Gulf Coast are getting it. We're into landfall time right now, so I was watching the news before we hopped on, but once again Jacksonville gets cleared out. You know who didn't though? This was interesting. I was talking to somebody and they were like, I don't know if this has ever happened before, that Valdosta, Georgia has a hurricane warning, because of where it was coming in through the armpit of Florida.
There's very little land between the ocean and Georgia. So it was just blowing right through Florida and it was still gonna be a hurricane when it got up into Georgia. Yeah. It happens. Climate change is real, y'all. Like maybe when your insurance companies started upping the prices, we should have all gotten the idea. Or: we're just dropping you because they didn't want to insure your zip code anymore. There's that too.
Not that that's ever happened to me, and I'm bitter about it. Yeah, I need coffee. It was a long ways away. Okay then. All right, you got coffee? I've got coffee. Let's go ahead. You've got news. Get into it. Where do you wanna start today? I don't have my articles up. Just a minute. Browser. I was getting my coffee. Where should we start today? I can guide you in, or you go ahead and pick one. No, you're all set. It's your show. Pick one. Oh, let's pick one.
Let's pick the integration of Power Automate telemetry data with Azure Application Insights. You went the other way. I didn't think you were gonna go that way. You didn't think I was gonna go this way? This one's your exciting one, huh? I don't know if this was exciting. This one's a little older. How's that? This one is like a week old at this point in time. Yeah. Sorry I messed y'all up. I'm jumping all over the place today. I don't think there's a ton with this one.
This one was an announcement back last Thursday, on August 24th, that you can now go into your Power Automate and set up certain data to come out of Power Automate into Application Insights. So this can be things like cloud flow runs in Power Automate. I still think they need to call 'em automations since they changed the name of it.
Anyways, it's cloud flow runs, plus trigger and action-level data, from an environment to App Insights, and it's really just a click, click, next. You go through export to Application Insights, select what you wanna export, and then Power Automate shoves it all into App Insights for you, into Log Analytics, so you can go look through it. It's really those three categories. You can also take multiple environments into a single instance. So I know I have some companies that have multiple instances.
It'll be interesting to see how much data eventually gets added. One of my peeves with Power Automate is, I've built some very complex flows for certain customers, and when something goes wrong or something errors out, it just says failed in the run history.
But if you have these flows that are running like a hundred, 200, 300 times a day, they only load the last 10 or 20 runs in the UI, I can't remember how many it is, and then you have to scroll, and it's the whole continuous scroll. You wait for the next 20 to load, and then you scroll and wait for the next 20 to load. It's not very easy to dive through Power Automate runs to view things like which actions are failing and when they fail, and try to dive through the details of what actually happened, or even go look at a historical run that may have completed but didn't do what it was supposed to. Because that happens too, where it doesn't actually error out but it doesn't work correctly.
Having some additional data to be able to pull out of these Power Automate flows and really analyze the data, analyze these different actions and what happened with them, would be very helpful. I don't know that this is there yet, but hopefully this is a step in ingesting those logs into someplace where it's a little bit easier to query them, dig through them, see what's going on within your Power Automate environment. Yeah. So nifty, and
maybe limiting in its first iteration, but that's okay. It's in preview. I should have called that out just to reiterate: preview, not for production, blah blah blah. So maybe it gets better. There's probably some things for folks to watch out for here if they've never done App Insights before. So App Insights, like you mentioned, is its own little service that sits off on its side and takes a pretty heavy dependency on Log Analytics and Log Analytics workspaces.
I think of it as like a wrapper layer on top of Log Analytics that ultimately gives you a way to interact with the data in the background in your App Insights the same way you would with a Log Analytics workspace. It's our old friend Kusto again, sitting there ready to go. But I think a couple of cool things happen along the way here, even with the limitations. Like you mentioned, there's only a couple different types of telemetry you can catch.
It's not like you can put a custom event in your power app that's then writing to that Log Analytics workspace. Unless you wanna wire that all up yourself, there's no built-in nicety for that. But once it's all there, now you've got all of the requests for your cloud flow runs, and then all of your cloud flow triggers and cloud flow actions that you mentioned also land in a dedicated table for dependencies.
So once all this stuff is sitting there and it's ready to go, then you can start getting nifty with it, because there's other things that you can start to do. What if you wanted to create an alert through App Insights and Azure Monitor that goes and looks at a particular flow and actually alerts any time it fails? Hey, now you can just spin that up and do it. So that's all nicety and ready to go. You mentioned multiple environments.
There is a note in the documentation, not in the blog post shockingly, that calls out that when this feature goes from preview to GA, it is only going to be available for managed environments. So you're going to need premium usage rights for users on those power apps, and I don't know why they're allowing you to do it without premium use rights and without managed environments in the preview. That seems. Yeah, that's odd. It's like a bait and switch, ish. A little bit.
I call it out because some folks don't always go and read the documentation before they spin this stuff up. Yeah, like me. Were you referring to me there, Scott? But yes, I agree. I was referring to us there too. To give it to everybody and then pull it away, that one is a little odd. We should probably go talk about premium or managed environments at some point in time too. I don't know that we've talked about those much.
Add it to your list. Yeah. Your other thing about KQL is nice too, 'cause I also have some flows that fail once because of an isolated one-off failure. Whatever triggered it didn't get passed in, it failed. So maybe my alerts for a single failure are different, versus maybe I wanna go see if this fails like 10 times in a row or fails 10 times in a certain time period.
I want to be alerted of multiple sequential failures, or multiple failures in a certain timeframe, that maybe indicate it wasn't something with whatever happened to trigger that flow that particular time, but something more indicative of a significant issue with my flow. Like a service account getting disabled that now causes all my connections to stop working, or something like that. Never happens. No, never.
Or it was tied to a user instead of a service account, and said user left the company and their account got disabled. That never happens either. Yeah, all those little things. We might have to come back and revisit this one once it GAs and see how things change with it along the way. But I'm a big fan of App Insights, so I would take having this data in there over not having it in there.
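That failure-window idea is easy to prototype before committing to an alert. Here's a minimal Python sketch of the logic, using made-up run records rather than real exported telemetry; in practice you'd express the same thing as a KQL query over the exported App Insights tables and wire it to an Azure Monitor alert rule, so treat this purely as an illustration of the detection logic.

```python
from datetime import datetime, timedelta

def sequential_failure_alert(runs, threshold=10, window=timedelta(hours=1)):
    """Return True if `threshold` or more failed runs fall inside any
    rolling `window` -- the kind of condition you'd express in KQL
    before attaching an Azure Monitor alert to it."""
    failures = sorted(ts for ts, status in runs if status == "Failed")
    for i in range(len(failures) - threshold + 1):
        # If the i-th and (i+threshold-1)-th failures are close together,
        # the whole group of `threshold` failures fits inside the window.
        if failures[i + threshold - 1] - failures[i] <= window:
            return True
    return False

# Hypothetical run history: (timestamp, status) pairs, one run every 5 minutes,
# alternating success and failure -- isolated failures spread over 95 minutes.
base = datetime(2023, 8, 30, 9, 0)
runs = [(base + timedelta(minutes=5 * i), "Failed" if i % 2 else "Succeeded")
        for i in range(20)]

print(sequential_failure_alert(runs, threshold=10))
print(sequential_failure_alert(
    [(base + timedelta(minutes=i), "Failed") for i in range(10)], threshold=10))
```

With the sample data, the spread-out alternating failures don't trip the alert, while ten back-to-back failures within ten minutes do, which is exactly the distinction between a one-off bad trigger and a dead service account.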
The other thing that you have to watch out for when you spin these up is it can create an App Insights resource for you, or you can go down the path of saying I already have one. So in your case, maybe you have a couple apps, and they're even across environments or things like that, and you wanna tie 'em all together in the same App Insights workspace. You can do all that. You just have to be mindful in the setup that you're going down that path and
not letting Power Automate make some decisions for you. Ooh, App Insights for everybody, kind of thing. Absolutely. Another governance thing to watch out for. Yes, there's a lot of governance stuff with the Power Platform as a whole. It's a unique service within the platform from a governance perspective. The platform is a unique service. There you go. Good way to sum it up. What's next, Scott? Which one are you gonna jump to?
I'm gonna keep us in platforms. Did you see, I know you're always excited about this stuff as our macOS reporter: Platform SSO for macOS is coming to Entra ID tenants near you, as a deployable, extensible agent on macOS clients. I think this one's kind of cool. This one is cool, and you're right, I am excited for this one, because up until now we did have the Microsoft Enterprise SSO plug-in for Apple devices.
So that essentially allowed you to set up SSO once you were logged in, so that you could SSO into various applications on your macOS device. Now it is coming to macOS as a whole. So this is taking that SSO plug-in and bringing it up to the next level, where now you can have your platform credentials for macOS go passwordless, using Touch ID to unlock the device, and be signed into Entra ID under the hood.
So essentially now you go log into your macOS device and you're signing into Entra ID as well. And they go into all the details: it's the device-bound, cryptographic-key, phishing-resistant credentials based on the Windows Hello tech and some of the Apple hardware that's already there. So this is gonna be cool. It's not quite Azure AD joining your device as a whole, but it's bringing it closer to that, where now you're again signing into your device and signing into Entra ID at the same time.
So this is enabling SSO for all your applications. You can also, with this, now synchronize your local account password and the Entra ID password. So your user account password on macOS is also your Entra ID password. It doesn't sound like it's necessarily the same account or even the same user ID, but at least the passwords on the backend are going to be the same. So you don't have to remember two passwords, if you even use a password for
anything anymore. Most of mine is all Touch ID, watch unlock, passwordless, all of that type of stuff. This is an upcoming public preview of Platform SSO for macOS, and it'll work with Intune. It is coming to other MDM providers soon. I would say the timing for this is interesting.
So it does say the macOS requirements, I saw it in here somewhere, Ventura and higher. But it was also announced the same day that Apple announced their September event for Mac, which is usually also when the next version of macOS comes out. So it'll be interesting to see.
At first I thought maybe this was going to be Sonoma-specific functionality, that Apple did something. But with the requirement being Ventura, maybe there's some updates coming to Ventura as well, or Microsoft just finally was able to incorporate this into Entra. A nice improvement for Mac users that are using Entra ID, formerly known as Azure AD. Yeah. So the way I ran into this one, you probably ran into it through the RSS feeds that we kinda automate
and bring in. I ran into this one over on, I think it was Mastodon. Somebody was chatting about it, and it was one of the PMs in Azure AD, and the way he framed it is like, hey, we're finally doing this. You can finally bring your CTO, your security leads, and your devs on board. I was like, that is so true. I remember I worked for a logistics company here in Jacksonville, and our CEO was the only user in the organization with a Mac. That was it.
He liked a Mac. The rest of us, we bought Lenovo stuff like it was free candy, but the CEO wanted his Mac, and that was like the pain point for all of us to live with. And then I was thinking about it, where I was like, oh, and our chief security officer used to be on a Mac too. Yeah, that made life more difficult back then. It's so true. Bring folks into the fold and get 'em where they need to be. I'd be interested in seeing how this one works.
There's other components of Intune and the whole MDM stack on a Mac that maybe don't integrate the nicest. So one area for me would be, say you impose FileVault restrictions, which is basically like encryption on the Mac side. Yep. You might have BitLocker on Windows, kinds of things. Does that extend all the way down to FileVault? Do you still need a local password for FileVault? What does that look like? How does it come through?
'cause when you do things like BitLocker, you get your recovery keys stored in Azure AD, and your help desk can work with you. Some of those constructs don't necessarily exist, like they're not the same on macOS Ventura all the way up to the upcoming Sonoma. We shall see. Yeah.
It's always interesting when stuff like this comes out, because I think sometimes from a marketing, public perception side of things, it's like Microsoft and Apple are big rivals and they can't stand each other and all of this. But then you see stuff like this and you wonder how much collaboration there is between the two companies, and I suspect there's a lot more, especially when it comes to stuff like this, or I know even some of the stuff with OneDrive Files On-Demand.
I think one of the sessions I saw, there was some partnership in working between 'em, in getting their systems to work together. When it comes to things like the SSO, or certain features that Microsoft rolls out, I get the feeling maybe there's a little bit more partnership behind the scenes than public perception always
gives you with some of this stuff. 'Cause it is important. Like you said, you want your C-level guys and girls to be brought into the fold, and your devs and your security people. Oh, you also forgot, Scott: marketing, graphics, multimedia. Those people are also usually all on their Macs. They are indeed. Right here. Do you feel overwhelmed by trying to manage your Office 365 environment? Are you facing unexpected issues that disrupt your company's productivity?
Intelligink is here to help. Much like you take your car to the mechanic that has specialized knowledge on how to best keep your car running, Intelligink helps you with your Microsoft cloud environment, because that's their expertise. Intelligink keeps up with the latest updates on the Microsoft cloud to help keep your business running smoothly and ahead of the curve.
Whether you are a small organization with just a few users, up to an organization of several thousand employees, they want to partner with you to implement and administer your Microsoft cloud technology. Visit them at Intelligink.com/podcast. That's I N T E L L I G I N K.com/podcast for more information, or to schedule a 30-minute call to get started with them today. Remember, Intelligink focuses on the Microsoft cloud so you can focus on your business.
Alright, so yeah, Platform SSO for macOS, nifty stuff. Why don't we keep talking about Entra ID since we're over here. There's another interesting announcement that came out this week-ish. This week. Yeah. Anyway, it'll have come out some week by the time somebody listens to this. But that is that there is the ability now to do API-driven provisioning within your Entra ID tenant.
So rather than having maybe some outside service that does some wonky synchronization or something else that comes in along the way, you can just go ahead and actually take your system of record. I don't know, let's say I've got SAP or SuccessFactors or something like that.
I can take the data out of there, and as long as I can extract it and get it out, you can then push those changes, like when they occur in that system of record, over to this new API which sits within the Entra provisioning service, and then that goes ahead and pushes that data out to where it needs to be.
And the cool thing is it's not only cloud provisioning, because we've got the whole Entra cloud sync thing, and you do have hybrid, like you mentioned, putting all those things together. So now you can push into a single API endpoint, and then say it's something that needs to effect change in your on-premises Active Directory. Great.
That change can flow straight down from the provisioning API and the provisioning service, down into what's yet another local agent running behind your firewall and ready to go, in the form of the provisioning agent.
And that can do things like push users and push changes into your on-premises Active Directory, and then if you have synchronization configured between that and your Entra ID tenant, cloud sync kicks in, connect sync, whatever you happen to be doing along the way to get you there. So I think that's a super powerful one. It kind of seemed to go out under the radar. I didn't see a bunch of fanfare about it, but I think it's super powerful, super cool, and a long time coming. Yes.
I agree, and I have some customers that this might be beneficial for. Like I've done some weird stuff, I'm not gonna lie, it gets a little hacky at times. But like one customer. That's janky. Janky, it's not hacky, it's janky. What about convoluted? Convoluted is a good one too.
I've done things for customers like, again, they have a different system of record that they need everything to go into Azure AD for, and it's things like CSV files getting emailed out of this system, because it can actually do regular exports as email reports, and then we pull the said CSV out of the email attachment, dump it into blob storage, which triggers another flow that runs a runbook to go parse the CSV,
find all the changes, and synchronize them up to Azure Active Directory. Just being able to tie straight into an API, if you can go straight from that system of record instead of all the secret sauce in between, this would be very nice. And I was talking to another customer that was in the same type of boat, different system of record, wanted to just shove everything straight into Entra ID, and this is going to be a good solution.
So this is one that is absolutely on my list, and I'm excited to play with, because I do have some customers in this scenario where we might be able to simplify some things for them. One of the other things to call out in here is you can give it a try, and the API-driven provisioning, this is another one of those preview features. Don't use it in production, yada yada yada. You can start using it if you have Microsoft Entra ID P1.
So the Premium P1 license for Azure AD slash Entra. But then it also says licensing terms will be released at general availability. So if you are a brave soul that wants to try this in preview, it doesn't sound like they are necessarily committed to this only being or remaining just a P1 feature. Similar to the Log Analytics one we talked about, or the App Insights one, this could change once
it hits GA, from that licensing perspective and what's required. Again, I know they want people to try it, right? That would be my guess as to why they're doing this: we have this license now, but we don't know what it's gonna be when it comes out in general availability. And they've done that before too with other stuff, or it's free now and it's not gonna be free once it comes out.
I wish they would just release the licensing terms right away, so that people would know: is this even worth playing with? Because I could play with it, but there's no way I'm gonna be able to upgrade everybody to Premium P2, or pay an extra $5 per user a month, to have API-driven provisioning, because it's just not feasible for their budget. So I don't know. It'd be nice to see what the licensing terms are gonna be once it hits general
availability, even if you have different ones in preview. In preview you can use it here; once it's at general availability, it's going to be this. It just helps from a planning perspective, in my opinion. Yeah. So there's some other interesting things, I think, about the provisioning service as a service kind of thing, and the way it's been implemented.
So lots of the identity management world, like if you think about common directory tooling, stuff like, I know SailPoint does this, I think Okta does this, off the top of my head, but there's a bunch of them that kind of interoperate based on a standard called SCIM, the System for Cross-domain Identity Management, I believe that is.
SCIM is basically an open standard, and it allows you to automate user provisioning end to end across all these various systems that also implement SCIM. One of the things that comes along with SCIM is bound schemas to work with in different scenarios, and the API provisioning service supports both of those. So it gives you the ability to map from the SCIM core user schema and the enterprise user extension all the way back to the attributes in Azure AD.
I think an interesting thing here is, even though the payload to talk to the API provisioning service is SCIM content, like it's actually a specific content type that gets fired into the API, so the content-type header on these requests gets set to application/scim+json, it is not a SCIM-compliant endpoint. So even though SCIM is the glue that's potentially holding things together, Microsoft is not fully coming into that ecosystem.
So it's a one-way inbound endpoint. This is inbound provisioning; it's not outbound provisioning and full write-back to these systems as well. So I think that's important to call out, and you do have to consume a custom API here. It's not, hey, let's go ahead and just fire this up into existing product stacks that natively integrate with SCIM. Like you've gotta do some work to get it going. It is not a standard SCIM endpoint by any definition of the record.
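To make the shape of that payload concrete, here's a hedged Python sketch of a SCIM bulk request of the kind the inbound provisioning API consumes: the core user schema plus the enterprise user extension, serialized as application/scim+json. The worker record, the attribute mapping, and the field choices here are all illustrative assumptions; the actual endpoint URL, auth, and supported attributes come from the Microsoft docs, so treat this as a sketch of the schema layering rather than a drop-in client.

```python
import json

# Hypothetical worker record extracted from a system of record (SAP, SuccessFactors, etc.)
worker = {"id": "EMP-1042", "given": "Ada", "family": "Lovelace",
          "email": "ada@contoso.com", "department": "Engineering"}

# SCIM bulk request body: each operation carries a core user plus the
# enterprise extension, which holds HR-style attributes like department.
ENTERPRISE_EXT = "urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
payload = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
    "Operations": [{
        "method": "POST",
        "bulkId": worker["id"],
        "path": "/Users",
        "data": {
            "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User", ENTERPRISE_EXT],
            "externalId": worker["id"],
            "userName": worker["email"],
            "name": {"givenName": worker["given"], "familyName": worker["family"]},
            ENTERPRISE_EXT: {"department": worker["department"],
                             "employeeNumber": worker["id"]},
        },
    }],
}

body = json.dumps(payload)
# You would POST `body` to your tenant's provisioning endpoint with
# Content-Type: application/scim+json (endpoint URL and auth omitted here).
print(body[:80])
```

The point of the sketch is the layering the discussion describes: a bulk envelope, core user attributes, and the enterprise extension nested under its own URN, all fired at a custom endpoint rather than a generic SCIM server.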
It's also very different from Graph, I think. If folks are looking at it and they're saying, oh, why wouldn't you use the Graph to do this? Fundamentally different things, like just the way the APIs are built out and what they're made to do. Like Graph being very OData-centric, single user,
not very bulk-oriented. This is all just bulk upload, bulk provisioning: get it into this schema, get it into this format, do some mapping for us, and we'll make the rest of the magic happen in the background. Which I think is very good to see.
I wonder how that changes over time, though, if it does need to become something that plays nicer in the ecosystem to get folks to latch onto it, or if they just go, oh my gosh, I've been waiting for this so long that I'm going to take it and run with it, kind of thing. Yeah, they have a big explanation of why it's not a standardized SCIM endpoint.
There's a lot of good information about this in the FAQ that we threw in the Discord chat, but if you're listening to this after the fact, it will also be in the show notes. If you are thinking about implementing this, definitely go read through these frequently asked questions about the API-driven inbound provisioning, because it's not just a... there's a lot of information. We'll leave it at that.
There's a bunch of information in these frequently asked questions with more details on how it's all working, why it's all working the way it does, all of that type of stuff. Indeed. All right, what else? We have time for one more. You wanna talk about Python? We can talk about Python. You were doing such a good job with your flow from one to the next, and then we just completely abandoned all that and said let's talk about Python,
but let's talk about Python. I ran outta things, so this'll be our last one. There was my GPO one. We can do that one next time though. We'll do Python. We'll save that one on the list. So Python is used all over the place in data science land. It's available in your favorite products of choice.
Like maybe you've spun up Databricks and you're like Ooh, I'm gonna run a notebook over here and get things going and start playing with some pandas data frames and things like that. If you've ever done that on that side and you're like, huh, I really wish I could do this in Excel now you can do pandas data frames and pandas extensions and all the things that come along with that ecosystem within Excel except you're not really doing it in Excel.
And I think this is cool, the way they set this one up and get it ready to go. So if you hop over to the latest Insider builds of Office, within Excel, in public preview, there is now Python functionality. So rather than having a standard Excel function, you can come over and you can run Python functions which are just using pandas. Which is super cool, because there's tons of plugins and other things for charting and visualization and all that stuff, or running specific types of
algorithms on top of your data. Like all that stuff exists in Conda, super easy to pull in. Like that's all supported over here. So if you're a data analyst and you've been pining to have access to data manipulation and working with data frames inside of Excel, I think this is a super cool thing. Not only on the data manipulation, but also the visualization front.
There's all sorts of other things that go into predictive analytics, modeling, statistics modeling, forecasting, all those kinds of things that now you can also go ahead and run through this engine and if you're already in the ecosystem, like it should be super easy to come over like it's all pandas, it's all Conda, Anaconda and that stack. It should all just work, which is cool. The even cooler thing I think is how it all just works, huh?
Like how did they do all this, and how did they bring this functionality down to a local Excel client and make it consistent? And the interesting thing is they didn't bring Python down to the client. They brought the data from the client up to the cloud, specifically in the form of a container.
So Microsoft is hosting Azure container instances that are using pre-built packages from Anaconda, that then you can go ahead, like I said, and absolutely bring in and inject your own packages into that container. But effectively Microsoft is running compute as a service in the cloud, in the form of these Python containers that are all stood up and ready to go, that your Excel client can talk to.
So it can pass data to that container, that gets sent out via this specific Python function that's now available in Excel. Like it's the PY Excel function, so PY, open paren and then close paren, and it can just return that data in the form of a data frame that you can then work with locally inside of Excel. I think there's all sorts of interesting kinds of things to follow up on here, but I'm excited about the Python part of it.
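For a feel of what goes inside one of those cells, here's a small pandas sketch. Inside Excel you'd typically pull a range into the cell with the xl() helper rather than building the frame by hand; since that helper only exists inside Excel, this standalone version constructs the data directly, which is an assumption made purely for illustration.

```python
import pandas as pd

# Stand-in for data Excel would hand the cell via something like
# xl("A1:C6", headers=True); built by hand here so the sketch runs anywhere.
df = pd.DataFrame({
    "region": ["East", "East", "West", "West", "West"],
    "month":  ["Jul", "Aug", "Jul", "Aug", "Aug"],
    "sales":  [120.0, 135.0, 98.0, 110.0, 45.0],
})

# The kind of one-liner analysis you'd write in the cell: the resulting
# DataFrame is what gets handed back to the grid.
summary = df.groupby("region")["sales"].agg(["sum", "mean"])
print(summary)
```

The appeal is exactly what's described above: a groupby-aggregate like this is one line of pandas, versus a tangle of SUMIFS and AVERAGEIFS in classic Excel formulas.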
Anytime I don't have to go and open a notebook and hop over to a specific workspace or anything like that is really nice. I wonder how much customers will latch onto it once they realize that the units of compute that are running things are actually container instances, sitting out there in a hosted-on-behalf-of, managed-on-behalf-of environment within Microsoft land. There is a little bit of an article out about this functionality; I'll put a link in the show notes.
It's called Data Security and Python in Excel, and it talks specifically about what these Azure container instances are, where code runs, and what your Python code has access to and does not have access to. So I think it's important to go and read through that, because effectively now your Excel client is punching out to a random place on the internet to go ahead and move data in and out of your Excel workbooks. Yeah, it is interesting that it's doing it all in Azure. So I have a question.
I've not done a bunch with Python, I don't do much data science, this is not on my list to go play with, but just from understanding it: is it that the amount of data and the amount of compute that Python would use to crunch this is such that it needs something like a container in Azure? Like why build this to run on an ACI instead of building this to run locally?
Or is it just that there would be so many more requirements on how you'd have to set it up locally? Because, and this came out, I actually had a friend of mine that asked this question, because in preview it's gonna be included, but this is another one where after preview some functionality, they don't specify what, will be restricted without a paid
license. And he was like, I don't wanna go learn all of this and set all this up while it's in preview, only to lose this functionality, because I'm not gonna get my company to go buy a paid license for me to do it once it comes out of preview. Why isn't there some way that I can still use Python in Excel but run it using my local compute, instead of having this dependency on ACI? Or maybe you can already do this all locally if you go set everything up.
That was just one of those I was curious about, not having done a ton with it. I didn't necessarily have a good answer. I think part of it is the ephemeral nature of data science. So if you think about these massive lakehouses and data lakes and other things that kind of sit out there today, right? There might be some form of centralized governance that sits on top of that, but quite often you have disparate communities that are actually working with the data.
Like you might have one set of data scientists working with, say, the medallion architecture, right? You're running everything through bronze, the raw data, refining it to silver, getting it out to gold, blah blah blah. All that data is meant for different audiences and for different types of analysis. So quite often you maintain the centralized data source, the lakehouse. Now let's imagine your Excel file is the centralized data source. Yep.
The people who consume that data are different audiences, and they're gonna go about it in different ways with potentially different tools. So you might spin up, like, a Databricks cluster to do some one-off analysis on an already refined dataset, 'cause you wanna go look at it in a different way, or there's a different statistical model you wanna
run against it, something like that. So you'll spin that up, you'll create it, you'll crunch all the numbers, you'll run through it all, and then you'll spin it down 'cause you got the result, kind of thing. And I think that ethos of ephemeral analysis that doesn't always need to stick around, plus the fact that maybe you're doing one type of analysis and I'm doing another, and we're gonna approach it in different ways with a different set of plugins,
all those kinds of things actually make a ton of sense with something like container instances, which are also quite ephemeral in nature, right? Spin 'em up, spin 'em down, only use the compute when you need it, and then, hey, the rest of the time I don't need the compute, so just ditch it out the side and get rid of it. So I imagine that's a big component of it: the very nature of some of this stuff is ephemeral end to end, if you think about it that way.
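For anyone unfamiliar with the medallion idea being described here, a toy sketch in pandas might help. The data is made up for illustration, and real medallion pipelines typically run on Spark or Delta tables rather than in-memory frames:

```python
import pandas as pd

# Bronze: raw records as they might land in the lake -- strings, nulls and all.
# (Invented hurricane wind readings, purely for illustration.)
bronze = pd.DataFrame({
    "day": ["2023-08-30", "2023-08-30", None, "2023-08-31"],
    "wind_mph": ["60", "45", "50", "38"],
})

# Silver: cleaned and typed -- drop incomplete rows, cast the measurement.
silver = bronze.dropna().assign(wind_mph=lambda d: d["wind_mph"].astype(int))

# Gold: aggregated for a specific audience, e.g. peak gust per day.
gold = silver.groupby("day", as_index=False)["wind_mph"].max()
```

Each layer serves a different consumer: raw for replayability, silver for analysts, gold for reporting, which is the "different audiences" point above.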
The other one is consistency and availability. Containers make a great way to spin up packaged environments that are the same for everybody at the beginning, and then they can deviate as needed, but whenever they deviate too far they can just reset, come back to baseline, and start again. That kind of thing. I think it all comes together and works. It's an interesting way to solve the problem, and I think it opens up the ecosystem to a bunch of new folks.
So if you're thinking about it from the context of, oh my gosh, my company is not gonna pay for a license for me to do this in Excel: is your company paying for you to have access to Databricks or Synapse or Snowflake or one of those other analytics platforms? 'Cause if they are, I'm gonna bet the cost bakes out to be a lot less to run a container instance on top of your Excel workbook than it is to spin up an entire unit of compute around even a single-node Databricks cluster. Got it. That makes sense. And I guess the other aspect of it, I was going through the article here too, is they do talk about it being built for teams, so you can also store these workbooks in
SharePoint. You can put them in Microsoft Teams, which is still putting them in SharePoint, or send them via Outlook, which you shouldn't be emailing files via Outlook anyways, but it still has all the comments and the mentions. And then as different people go grab these workbooks, to your point, your Excel workbook is your source of data, and maybe different people have different worksheets, or other people are working with you on the data,
they'll be able to open it and not all of a sudden run into, my machine isn't set up the same way your machine is from a Python perspective, so it's not working. You just open the workbook and it sounds like it's there. I'm guessing as soon as someone opens it, it spins up a new ACI instance in the background to be able to do anything with it. They say sensitivity labels work with all of this for protection policies, but this also probably better facilitates the portability
of an Excel workbook. I guess I would be curious, does it only call out when it re-renders data? Let's say you have the license for this, you go do a bunch of data analytics around the podcast in Excel, you send it over to me or share it with me and I don't have a license, is it just not gonna work for me? It'll be interesting to see how that works too if you share these with people who are licensed differently for Python in Excel. TBD, we'll see, I haven't seen that one yet.
There's a really good video out there. There's this awesome person, Leila Gharani, she's an Office MVP, like, she's great. She's got a YouTube channel. The way you'd watch John Savill videos for Azure stuff, just go watch her videos for Office stuff. Excel, Word, that kind of stuff I'm not excited about all the time, but she has an awesome video that walks through this feature in a really great way. It takes you through the baby steps of, okay,
let's bring in our first data frame, let's see how a data frame plays over here. Talks a little bit about what data frames are, and leaves the whole ACI thing and the technology in the background out of it. Just, here's what Python in Excel potentially means for you, with a nice simple dataset. I think rather than folks getting out their jump-to-conclusions mat and trying to worry about licensing and how it all works in the background,
maybe just go see if the functionality works for you. And if it does and the value's there, it's a lot easier to have that conversation about downstream ROI on licensing and all those kinds of things, because if you get analysis paralysis at the beginning, I think for something like this you're potentially missing out on what could be a really cool tool in your toolbox. Right?
I wonder if it'll end up being even like a pay-as-you-go type license, or you can buy credits, thinking back to the whole spin-up-a-container-as-units-of-compute thing. We've seen more of this pay-as-you-go stuff coming in through some of the Syntex things we talked about earlier too. I could see this one even being a pay-as-you-go model based on usage of those instances. So we shall see. But it is compute somewhere, right?
Like, let's be clear, it is compute that's running, like all the other compute things. You look at Dev Boxes and Cloud PCs and all this other stuff that runs in the background in Microsoft land. Part of the reason that stuff costs money is because it costs Microsoft money to run it. You're literally firing up a server for some period of time. So keep that in mind too about licensing models and things like that.
Sometimes the underlying technical implementation has to drive some of that, just to get those wonderful COGS back. Yep. But I do like, to that point of portability, it definitely is a better way to do this, I think, from a portability standpoint. So who knows what it will end up being, but definitely some cool features and functionality coming to Excel. And with that, we should call it a day.
The leaves are still gently waving outside my window with an occasional wild gust or two, but I also have clients pinging me that they have an urgent issue I should probably go attend to. It happens. All right, cool. Go fire up Spotify, start playing "Rock You Like a Hurricane," a little bit of Scorpions in the morning to get you going, and you'll be all set. Should I play it loud enough that I can go stand in the middle of the street with my flag and brave the hurricane?
Be the Florida man. It was funny, one last story. I had a friend who totally did this the other day. I was on a conference call with him, he's down in Tampa, and he was like, oh, this is a good squall coming in, I'm going outside. And he literally left in the middle of the conference call to go run outside in the middle of the squall, and he didn't come back for 20
minutes. I was like, he's outside, hanging onto a stop sign, blowing sideways. But I was like, this is, it's definitely Florida. Yeah. Yeah, definitely Florida. That's our new motto now. Definitely Florida. Alright, thanks Scott. Enjoy the rest of your day. Hopefully you're able to maintain power throughout the day today, and we will talk to you later. All right, thanks Ben. Yep. If you enjoyed the podcast, go leave us a five-star rating in iTunes.
It helps get the word out so more IT pros can learn about Office 365 and Azure. If you have any questions you want us to address on the show, or feedback about the show, feel free to reach out via our website, Twitter, or Facebook. Thanks again for listening and have a great day.