
Windows Compatibility Pack #16

Merged — 5 commits merged into master on Nov 7, 2017
Conversation

@terrajobst (Member) commented Oct 25, 2017

When we shipped .NET Core 1.x as well as .NET Standard 1.x, we were hoping to use that as an opportunity to remove legacy technologies and deprecated APIs. Since then, we've learned that no matter how attractive the new APIs and capabilities of .NET Core are: if your existing code base is large enough, the benefits of the new APIs are often dwarfed by the sheer cost of reimplementing and/or adapting your existing code. The .NET Framework Compatibility Pack is about providing a good chunk of these technologies so that building .NET Core applications as well as .NET Standard libraries becomes much more viable for existing code.

@Drawaes commented Oct 26, 2017

While these will directly help me in real-world situations in my day job, night-time future-.NET me thinks it's sad they will be added. I had hoped that with the addition of .NET Standard 2.0 the resourcing and energy would be spent 100% on providing forward-looking and bold solutions.

It comes to the point where we just say "what's the point?" These apps clearly run on the full framework anyway, and ASP.NET runs there too. At some stage you have to say to people that you can't move to the new shiny xplat platform because you are using platform-specific, unsupported APIs.

Anyway, again, as with 2.0, I get it, and like smashing back a lot of whiskey it will make me feel good for a while. But I can't help feeling that somewhere there is a metaphorical 3-year-old waiting to wake me at 6am to play who will make me realise it wasn't a good idea.

private static string GetLoggingPath()
{
    // If we're on Windows and the desktop app has configured a logging path, we'll use that.
    if (Environment.OSVersion.Platform == PlatformID.Win32NT)
Member:

this should probably use RuntimeInformation.IsOSPlatform(OSPlatform.Windows)

@terrajobst (Member Author) commented Oct 26, 2017

I've used Environment.OSVersion (1) because it's the API folks have used for years and (2) because it allows for version checks. We originally removed Environment.OSVersion explicitly because version checks are known to be fragile, and we designed RuntimeInformation as the replacement for diagnostic information and simple platform checks. However, we've received a lot of feedback that version checks are needed in certain cases, which is why we added the API back as part of .NET Standard 2.0. At this point I see little reason to avoid the existing API.

Member:

I understand why we added it back, but that doesn't remove the reasons it is bad to use. I still think our recommendation is to use RuntimeInformation.IsOSPlatform(OSPlatform.Windows), and we will hopefully get back to designing better version checks into that API as well.

@terrajobst (Member Author) commented Oct 27, 2017

What makes this particular example better? I'm not opposed to changing the code here, but I'd like to understand the reasoning behind it.

@weshaggard (Member) commented Oct 27, 2017

Will Win32NT be returned on all platforms that support the registry? Maybe, maybe not. The point is that checking for an exact enum value is generally error-prone as soon as we need to return a different enum value for any other reason (e.g. maybe Nano returns something different). That is why we designed IsOSPlatform to support platform families or hierarchies, like Unix/Linux/etc.

Member:

More generally, suppose you want to write code that does something specific on Windows and OSX: it's not possible with Environment.OSVersion.Platform today since OSX is never returned by those APIs. You'd need to mix Environment.OSVersion.Platform with RuntimeInformation which is just ugly.

That's why we should recommend RuntimeInformation and not talk about Environment.OSVersion.Platform whenever possible :)

@terrajobst (Member Author):

Makes sense. I've changed the code and filed an issue to deprecate the API.
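
For reference, here's a minimal sketch of the revised check, reassembling the snippet above with the fallback path shown later in this thread (the Fabrikam registry key is the proposal's running example, not a real product path):

using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32;

static class LoggingPaths
{
    public static string GetLoggingPath()
    {
        // IsOSPlatform matches a platform family rather than an exact enum value,
        // so it also covers Windows flavors that might report a different PlatformID.
        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            using (var key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam\AssetManagement"))
            {
                if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
                    return configuredPath;
            }
        }

        // Either we're not on Windows or no logging path was configured, so fall
        // back to the location for non-roaming, user-specific data files.
        var appDataPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
        return Path.Combine(appDataPath, "Fabrikam", "AssetManagement", "Logging");
    }
}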

@tmds (Member) commented Oct 26, 2017

@terrajobst I find this proposal violates the semantics of netcoreapp and netstandard. Have you considered giving this hybrid its own TFM, e.g. wincoreapp?
That TFM matches the use case well: use the shiny new stuff from .NET Core on Windows, with Windows specifics.
On the xplat use case: I think it's a bad idea to pretend you are xplat at compile time and then blow up at runtime because you aren't.

@Drawaes commented Oct 26, 2017

^^^ This. Any time an error moves from compile time to runtime, the cost is several factors higher in testing/fixing/debugging. I am concerned that libraries could appear that "seem" xplat until you run them.

@onyxmaster:

While I have a moderately sized (300 KLOC) codebase to port, I don't think adding such a pack is the best idea. People are better off porting their code, probably refactoring it along the way; any "drop-in" replacements will still not be 100% compatible with the original implementation. Also, supporting this would use your team's resources, which could be spent working on the future instead of the past.

@daveaglick commented Oct 26, 2017

I'll echo the concern raised by others here regarding the possibility of creating netstandard libraries that only work on Windows and fail only at runtime. I can totally see the scenario in which a developer tries to port to netstandard, can't get it all to work, and then pulls down the compat pack and calls it a day. Too bad for you if you find this library: it targets netstandard and does something that seems like it would be platform-independent, and then blows up at runtime (possibly in the field, on untested control flows).

That said, I can also see this being valuable in a variety of scenarios. I think the problem is properly indicating whether a netstandard library is really targeted only at a specific platform. To some extent that's up to the publisher; we probably can't determine that automatically in all cases. However, in this case we do know, because the compat pack is a specific library.

I'd propose something along one of these lines:

  • Add support to the NuGet gallery and NuGet UI to call out if a netstandard package is using the compat pack library. A little warning icon or text like "this package may only be compatible with Windows".
  • Don't put the compat pack on NuGet. I know that sounds a bit drastic, but by forcing the consumer to manually download and reference the library it'll be crystal clear that they're opting-in to APIs that will probably break on non-Windows platforms. The tricky thing here will be to make sure the compat pack doesn't just end up embedded in the packages of libraries that use it - something like a development dependency requirement or NuGet packing blacklist (this part may not be feasible).
  • Solve the broader problem of identifying netstandard packages that aren't cross platform by adding extra metadata to the NuGet file to indicate such. Even better would be if this could be transitive so that if a consumed package indicated it was Windows-only (like the compat pack) any consuming packages would also automatically indicate such.

@danmoseley (Member) commented Oct 26, 2017

A couple of folks expressed concern about the value being worth the cost/distraction. I don't think that's a problem. The ports are generally not difficult, and we don't expect them to need much maintenance because we don't expect to add features or otherwise churn the code. As for value, we've gotten consistent feedback (from direct engagements, not just GitHub votes) that porting these old APIs will help customers migrate progressively to .NET Core. It's very common, of course, for an enterprise to have a lot of legacy dependencies they don't want to modify or replace all at once.

The key conversation in my mind is marking them as netstandard. I'll let @terrajobst continue that conversation.

[edit - reworded]

@Drawaes commented Oct 26, 2017

Resourcing doesn't necessarily mean $$ ;) Often it's a discussion around mindshare and brainpower. As you say, maybe it's not a big deal here if it's a direct port.

The pollution of the .NET Core/Standard ecosystem with a bunch of legacy, platform-dependent APIs concerns me a lot more, and I say this as someone with 2M+ LOC of "legacy" C# in a single department; I would hate to think how much there is across the org.

It still stands in my mind that if you want .NET Standard features, and hey, even ASP.NET Kestrel etc., but want to stick with Windows perf counters or the registry, then compile against 4.6.2+. If you want to make a library that is xplat but #ifdefs a bunch of non-xplat features, then cross-compile a .NET Standard + a net462 version...

We have a lot of projects like the above, and cross-compiled libraries as we are moving; it's a perfect marker for what is and isn't xplat (most is still good old C# code). I don't want "core compatible" code that blows up if I try to run it on Linux all of a sudden.

At the end of the day, of course, there are hidden commercial considerations like you say, and I respect that the decision you make will be made with the best intentions. Just a reminder that the people at the "top" that you talk to in these orgs (e.g. massive corps) are often the squeaky ones as well, just like on GH etc., and there are a lot of dark-matter devs who think the current situation is fine.

@i3arnon commented Oct 26, 2017

porting these old APIs will help customers migrate progressively to .NET Core

As an example, we at Microsoft ATA definitely want to move at least some of our codebase over to corefx even without going x-plat. We're actively clearing the hurdles we can but some (e.g. https://github.com/dotnet/corefx/issues/2089) are harder than others.

@Drawaes commented Oct 26, 2017

Why do you want to move to Core then? What's the upside?

@bording commented Oct 26, 2017

Is the decision for what's included in the Compatibility Pack still up for debate? For example, I'd like to see the System.Messaging namespace included so that MSMQ could be used from .NET Core.

@i3arnon commented Oct 26, 2017

Why do you want to move to Core then?

Well, we would like to benefit from all the improvements and additions to corefx. But more importantly, we install and frequently upgrade our sensors on many of our customers' servers. Currently that means installing the latest .NET Framework, which may require admin credentials and possibly restarting the machine. Updating a self-contained corefx app, however, only requires xcopy.

@Drawaes commented Oct 26, 2017

@i3arnon Great response! (In case it didn't come across that way: I was actually interested in why you wanted to move, not just being annoying ;) ).

The self-contained deployment, and the lack of needing an install, is definitely an upside that we see for Core (more even than xplat), and perf of course. My concern is being played out here though, as now MSMQ is being asked for.

So now there is a risk that some .NET Core apps might not even run on Windows Nano, let alone xplat ;)

Anyway glad I don't have to make these decisions... 😆

@terrajobst (Member Author) commented Oct 26, 2017

@Drawaes @tmds @onyxmaster @daveaglick

[Being able to access platform-specific APIs violates .NET Standard]

We have had to add APIs to .NET Standard that don't work everywhere, because they happen to be on types that we needed in .NET Standard (example: Console.WindowWidth). However, in this case we weren't constrained that way. We could, in fact, decide that Windows-only technologies cannot be installed into a .NET Standard library. A logical extension would be to also require .NET Core application authors to be RID-specific (i.e. specify an operating system) so that we can fail to install the packages into .NET Core applications unless you're targeting Windows.

However, I just don't think that's a good developer experience. As outlined in the doc, there are cases where you want to call "one more" API in order to tailor your library to a particular operating system (currently it's only Windows, but this could equally apply to Linux- or Mac-specific APIs). Forcing people to multi-target is doable, but it does add quite a bit of complexity. Now, for libraries you at least have the benefit of NuGet, where you can encapsulate the fact that you compiled for multiple TFMs/RIDs. But for applications, that starts to suck a lot more because you now have to deploy different outputs depending on the operating system (which is always true if you're self-contained, but not required if you're targeting the shared framework). Again, it's not impossible, but it seems to invite friction.

Let's step back for a second and let's see what the problems are:

  1. Dependencies. You may not like the idea of authoring a library or application that will deploy assemblies on operating systems where the API will always throw.
  2. No timely feedback. You don't want to call platform-specific APIs by accident only to discover very late that they in fact do not work.

Let me first address (1): nothing prevents you from multi-targeting and using #if to remove the dependency from non-Windows implementations. For example, if you're building a NuGet package, you can simply target net461 and netstandard2.0 and only add the package dependency to the net461 configuration. In your code, you'd then condition all .NET Framework-specific API usages with #if, as sketched below. Or you might decide not to target .NET Standard at all.
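
To make that concrete, here is a minimal sketch of the multi-targeting shape (NET461 is the compiler symbol the SDK defines for the net461 target; the Fabrikam key is the proposal's running example):

// csproj: <TargetFrameworks>net461;netstandard2.0</TargetFrameworks>, with the
// Windows-only package reference placed in an ItemGroup conditioned on
// '$(TargetFramework)' == 'net461'.
#if NET461
using Microsoft.Win32;
#endif

public static class LoggingSettings
{
    public static string GetConfiguredLoggingPath()
    {
#if NET461
        // Compiled only into the net461 assembly; the netstandard2.0 build
        // carries no registry dependency at all.
        using (var key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam\AssetManagement"))
        {
            if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
                return configuredPath;
        }
#endif
        return null;
    }
}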

The second argument can be addressed differently. For starters, we don't plan to add large chunks of platform-specific APIs to .NET Standard. That's why we went to great lengths to extract things like the registry from it. So your project has to add a package dependency explicitly.

Secondly, we're working on a diagnostic story that will tell you early that you're calling platform-specific APIs. For the example I provided in the proposal, it would look like this:

[screenshot: analyzer warning flagging the platform-specific API call]

The idea here is that you're supposed to suppress these warnings per method after verifying that you guarded them correctly (we could try to detect these cases automatically, but that's generally quite hard to do). You could also decide to configure our analyzer to turn these warnings into errors and prevent the build from succeeding. That way, you can be sure you have to think about it, and the code must be modified to explicitly call out the instances where you're OK with calling APIs that don't work everywhere.

[screenshot: the same warnings configured as build errors]
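
In code, the suppression pattern would look roughly like this (PC001 is the diagnostic ID the platform-compat analyzer later shipped with; treat the exact ID as an assumption in the context of this thread):

using System.Runtime.InteropServices;
using Microsoft.Win32;

static class GuardedRegistryRead
{
    public static string TryReadLoggingPath()
    {
        // The guard the analyzer cannot verify for us:
        if (!RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
            return null;

#pragma warning disable PC001 // API not supported on all platforms -- guarded above
        using (var key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam\AssetManagement"))
            return key?.GetValue("LoggingDirectoryPath") as string;
#pragma warning restore PC001
    }
}

// To fail the build instead of warning, promote the diagnostic in the csproj:
// <WarningsAsErrors>PC001</WarningsAsErrors>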

@Drawaes commented Oct 26, 2017

Will the errors be transitive? And, as mentioned above, could these warnings perhaps be summarized in NuGet?

@terrajobst (Member Author) commented Oct 26, 2017

Will the errors be transitive? And, as mentioned above, could these warnings perhaps be summarized in NuGet?

We currently don't do that because it's not easy to do efficiently.

However, I also question whether that's sensible. My argument would be: the library author is responsible for guarding the calls correctly. If they don't, it's a bug on their end. It's the same with other APIs (don't pass null, don't call Stream.Write if the stream is read-only, etc.).

But for auditing you could use API Port to check what your application (and all its binaries) is doing.
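
For example, a typical invocation points the tool at your build output and produces a portability report (usage per the dotnet/apiport README; flags may have changed since):

ApiPort.exe analyze -f C:\MyApp\bin\Release -r HTML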

@danmoseley (Member) commented Oct 26, 2017

@bording

Is the decision for what's included in the Compatibility Pack still up for debate? For example, I'd like to see the System.Messaging namespace included so that MSMQ could be used from .NET Core.

We want to release this next month, so the content is pretty locked in. We could certainly release again with more packages, if there's demand, and especially if there are volunteers to help with the porting/tests. (Please make sure there's an issue tracking what you would like to see on .NET Core, and solicit votes on the top comment. Mark it port-to-core.) I do expect, though, that the great majority of our resources will be on forward-looking work after this.

@terrajobst (Member Author) commented Oct 26, 2017

@bording

Is the decision for what's included in the Compatibility Pack still up for debate? For example, I'd like to see the System.Messaging namespace included so that MSMQ could be used from .NET Core.

We can still add to the pack. The current composition is a result of (1) what we already had and (2) what customers filed requests for and how many we got. Looks like we missed this item for the current version, but I've now marked it properly.

@terrajobst (Member Author) commented Oct 26, 2017

Sorry @danmosemsft, you must have submitted before I completed editing mine. We seem to think alike though 😉

@Drawaes commented Oct 26, 2017

Is APIPort online only, or is there an on-prem version?
As for the NuGet/transitive concept: I was hoping old-fashioned public shaming would restrict their spread; maybe I have been in England too long if I am thinking like that.

@terrajobst (Member Author):

@Drawaes

Is APIPort online only, or is there an on-prem version?

API Port is available as a command-line tool, but it requires an internet connection to talk to the service. AFAIK there is no offline version. The analyzer I showed here is fully offline-capable though.

As for the NuGet/transitive concept: I was hoping old-fashioned public shaming would restrict their spread; maybe I have been in England too long if I am thinking like that.

I was considering allowing package authors to declare what they tested on/support.

@daveaglick:

I was considering allowing package authors to declare what they tested on/support.

I think this is the best compromise between platform-specific functionality and not unknowingly inheriting runtime exceptions. It's not as cumbersome as multi-targeting or multiple TFMs and wouldn't need to be enforced by the build tooling. Just a little bit of metadata in the package saying "I support these platforms..." (the absence of which would indicate broad platform support). That could be displayed in the gallery and the VS NuGet GUI.

@Drawaes commented Oct 26, 2017

I mean, even the local version sends the code... "elsewhere"? If so it's a no-go for the industry I am in, and perhaps even illegal to send the code... outside.

@Drawaes commented Oct 26, 2017

(the absence of which would indicate broad platform support). That could be displayed in the gallery and the VS NuGet GUI.

I could almost get on board with that if there were three states:

  1. You specified platforms.
  2. You didn't say anything.
  3. You ACTIVELY said broad support.

In the metadata, so basically there was a "null", which is every package today. I would prefer people actually said "look, I am pretty sure I am good on xplat" rather than it being assumed.

@terrajobst (Member Author) commented Oct 26, 2017

[@Drawaes] I mean, even the local version sends the code... "elsewhere"? If so it's a no-go for the industry I am in, and perhaps even illegal to send the code... outside.

No code is sent anywhere. The tool dumps the list of all the framework APIs you call and sends that to the server. We do not send any of your assemblies or IL to the server. Here is what we said in our blog post:

[@terrajobst] Don’t worry — we will not collect any information about your code. You don’t have to take my word for it: API Port is open source and is hosted on GitHub.

[@Drawaes] In the metadata, so basically there was a "null", which is every package today. I would prefer people actually said "look, I am pretty sure I am good on xplat" rather than it being assumed.

Yeah, I think there would be three values: no statement, specific set, everything.

@Drawaes commented Oct 26, 2017

Okay in that case I can look at using it in our CI 👍

@terrajobst changed the title from "Add proposal for compat pack" to ".NET Framework Compatibility Pack" on Oct 26, 2017
@tmds (Member) commented Oct 27, 2017

A logical extension would be to also require .NET Core application authors to be RID-specific (i.e. specify an operating system) so that we can fail to install the packages into .NET Core applications unless you're targeting Windows.

Or its own TFM: wincoreapp.

Forcing people to multi-target is doable, but it does add quite a bit of complexity.

Under its own TFM, it's not multi-targeting.

But for applications, that starts to suck a lot more because you now have to deploy different outputs depending on the operating system

Again, not a problem under its own TFM.

nothing prevents you from multi-targeting and using #if to remove the dependency from non-Windows implementations.

If these libraries are not netstandard, then that facilitates multi-targeting.
The compiler will help you figure out where to put the #ifs:

private static string GetLoggingPath()
{
    // If we're on Windows and the desktop app has configured a logging path, we'll use that.
#if WINCOREAPP2_0 // '.' is not valid in a preprocessor symbol, so the define would use '_'
    {
        using (var key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam\AssetManagement"))
        {
            if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
                return configuredPath;
        }
    }
#endif

    // We're either not on Windows or no logging path was configured, so we'll use the location
    // for non-roaming user-specific data files.
    var appDataPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
    return Path.Combine(appDataPath, "Fabrikam", "AssetManagement", "Logging");
}

For starters, we don't plan to add large chunks of platform-specific APIs to .NET Standard. That's why we went to great lengths to extract things like the registry from it.

The effort was to ensure 'netstandard' did not include platform specifics. Is this 'compat pack' not undoing parts of that effort?

Also, if you weren't doing this under the 'netstandard' name, I wouldn't care how much you added to it. You could add everything from .NET Framework that can run on .NET Core. Now the scope is "we don't plan to add large chunks". Plans may be flexible in a bad way (cf. @danmosemsft's reply to @bording). This may dilute the meaning of netstandard over time. What is the scope of the compat pack?

Next, suppose netstandard2.1 includes the registry (emulated on non-Windows), how does that impact the compat pack?

Secondly, we're working on a diagnostic story that will tell you early that you're calling platform-specific APIs. For the example I provided in the proposal, it would look like this:

VS looks really user-friendly, but I am not a VS user.

What I like: compiler warnings for every library I include that is or uses a compat library.
A tool/analyzer that doesn't work offline, or that only works on Windows, is 👎👎👎👎.

Yeah, I think there would be three values: no statement, specific set, everything.

👍 on making it possible to express that a library works xplat even if it is referencing compat libraries.

Summary:

  • I'm worried what this means for netstandard
  • We need compile time warnings for usage of libraries which are known not to work xplat when targeting netstandard/netcoreapp

@gulbanana commented Oct 27, 2017

My organisation's main scenario: we develop applications on Windows, then run some of them (ASP.NET Core servers) on either Windows or Linux. It's often unpredictable which server platform will ultimately be chosen, as clients have their own servers or cloud setups. I am definitely concerned about netstandard 'not meaning netstandard'. Please try to ensure that, when feasible, non-portable packages do not compile in projects that are meant to be portable. We don't have the resources to test on all target platforms from the start of development and may not even know what the ultimate target platforms will be.

@Drawaes commented Oct 27, 2017

^^ That is my situation too. However, I think with APIPort and some smart CI I might be able to ensure none of our apps "regress". Basically I am hoping to build a "gate": once an app makes itself xplat, it stays xplat...

I think if I can get the tooling right, then I am generally happy; if I do get the tooling working and it's messy, then maybe that is something the community or MS can help tighten up. It might also be a nice "scan" check that could feed the above-mentioned NuGet metadata...

E.g. I imagine you add this NuGet package, and as part of your MSBuild/package step it runs an APIPort scan and then updates the metadata to say:

"Will actually not work on OSX because you used feature X"

@terrajobst (Member Author) commented Oct 27, 2017

@tmds

Or its own TFM: wincoreapp.

A target framework is a very heavy-handed solution. Think about what this means: it creates a separate entity that can be targeted by apps as well as libraries. If you're in wincoreapp, can you reference components targeting netcoreapp? If the answer is no, it means library authors targeting netcoreapp would also have to target wincoreapp to make sure their libraries can be consumed by developers targeting wincoreapp. The only sensible answer is yes, but this results in a whole bunch of fragility. For starters, it means NuGet would have to establish a compatibility relationship between wincoreapp and netcoreapp. Which versions would be compatible with each other? All of them? Or are specific versions mapped? We used to have a ton of code in NuGet that did these mappings, and it turned out to be too hard to maintain and too hard for customers to understand what's legal and what isn't.

We don't want to do this moving forward. We want very simple TFM rules: you can only reference your own TFM and .NET Standard. In other words, .NET Standard is the only TFM NuGet has to map. Nothing else.

Also, what does that mean for .NET Standard? Presumably you'd introduce a wincoreapp TFM to disallow .NET Standard libraries from calling Windows-only technologies. That would mean .NET Standard libraries must use multi-targeting to use Windows-only APIs. In practice it means you'd need to target at least three: netstandard, net, wincoreapp. What happens for Mono on Windows? Unity on Windows?

Because of all that complexity I consider a separate TFM a non-starter.

The effort was to ensure 'netstandard' did not include platform specifics. Is this 'compat pack' not undoing parts of that effort?

No, because the compat pack isn't part of .NET Standard. Think of the compat pack as a .NET Standard library that uses the multi-targeting you proposed as the solution above; it just has an #else throw new PlatformNotSupportedException();.

In other words, even with your proposal, anyone can still write a .NET Standard class library that doesn't work on Linux. So we'd build very complex machinery that still doesn't protect you from the things you'd like protection from. My approach goes like this: accept that it is possible, and provide good tooling and guidance to make it easy to do the right thing.

Easy things should be easy, advanced things possible, and the effort should be proportional.
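
To illustrate the pattern terrajobst describes: the pack's netstandard build compiles the same public surface but throws where the platform can't deliver. This is a simplified, hypothetical shape, not the actual corefx source; WINDOWS_IMPLEMENTATION is an invented define:

#if WINDOWS_IMPLEMENTATION
using Microsoft.Win32;
#endif
using System;

static class CompatRegistry
{
    public static object ReadValue(string subKey, string name)
    {
#if WINDOWS_IMPLEMENTATION
        // Windows build: real implementation backed by the registry.
        using (var key = Registry.CurrentUser.OpenSubKey(subKey))
            return key?.GetValue(name);
#else
        // netstandard build: same API surface, but the call throws at runtime.
        throw new PlatformNotSupportedException();
#endif
    }
}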

@tmds (Member) commented Oct 27, 2017

Because of all that complexity I consider a separate TFM a non-starter.

Thanks for giving your view.

@terrajobst (Member Author) commented Oct 27, 2017

@tmds

I don't mean to come across as dismissing your arguments, by the way. I might come across as having all the answers, but that's just because I once believed we could model the complexity of the .NET API surface in such a way that APIs only show up where they should. We called that "contracts". Since then, we've learned the hard way that all the complexity, in both tooling and concepts, creates too much cognitive load for developers to reason about. Of course, this doesn't mean it's impossible, but I've yet to see a viable alternative to the current plan of record.

Summary:

  • I'm worried what this means for netstandard
  • We need compile time warnings for usage of libraries which are known not to work xplat when targeting netstandard/netcoreapp

We do share your concerns, and we do believe it's key to keep .NET Standard focused on APIs that work everywhere. That's why the compat pack isn't part of .NET Standard, and the parts that can't work everywhere should never become part of it. For instance, I'd consider adding the registry to .NET Standard a really bad idea (because it's Windows-only). Same with reflection emit (because not all platforms support runtime code generation). Now, we may add some of the functionality (for example, if System.IO.Ports works everywhere, then we could just add it).

A tool/analyzer that doesn't work offline, or that only works on Windows, is 👎👎👎👎.

Totally agree. It's also not just about Windows, but about command-line builds and CI servers. The Roslyn analyzer whose screenshot I shared earlier isn't tied to VS or Windows. It uses the generic Roslyn analyzer infrastructure, which injects diagnostics into the compilation pipeline. These warnings will not just be squiggles in the IDE; they also show up as warnings on the command line and, if you want, even as errors. Since it's part of the compilation, it runs everywhere the compiler runs today.

What makes these analyzers better than previous attempts (like FxCop) is that they also provide live feedback as you're typing in the IDE. While OmniSharp doesn't support this yet, I know they are interested in providing this experience too.

@terrajobst changed the title from ".NET Framework Compatibility Pack" to "Windows Compatibility Pack" on Nov 7, 2017
@terrajobst (Member Author):

It seems we've reached a consensus and can consider this design approved.

@terrajobst terrajobst merged commit 1fd98f5 into master Nov 7, 2017
@terrajobst terrajobst deleted the compat-pack branch November 7, 2017 03:32
@ewinnington:

So no news on System.Xaml?

@terrajobst (Member Author):

So no news on System.Xaml?

We've scoped out the desktop application models. One could argue that System.Xaml is independent of them, but usage outside of WPF and WF seems too low to warrant that much investment at this point. Of course, this doesn't mean never, but so far it has always been displaced by other items on the backlog.

@ewinnington:

At this point it is becoming a self-fulfilling prophecy. System.Xaml access is needed to make meaningful progress on WFCore, and it would help people building XAML-based front ends.

Any news on System.Xaml in reference source, then? That way we could port its code for an implementation for WFCore and others.
