I get the following deprecation warning when saving/loading a gensim word embedding:
model.save("mymodel.model")
/home/.../lib/python3.7/site-packages/smart_open/smart_open_lib.py:398:
UserWarning: This function is deprecated, use smart_open.open instead.
See the migration notes for details:
https://github.com/RaRe-Technologies/smart_open/blob/master/README.rst#migrating-to-the-new-open-function
'See the migration notes for details: %s' % _MIGRATION_NOTES_URL
I don't understand what I'm supposed to do based on the notes on that page.
So, how should I save and open my models instead?
I use Python 3.7, gensim 3.7.3, and smart_open 1.8.4. I don't think I got the warning when using gensim 3.7.1 and Python 3.5; the smart_open version should have been 1.8.4 there as well.
You can ignore most "deprecation warnings": they are just advisories that an underlying mechanism still works for now, but that there is a new preferred way of doing things which may become required in the future.
In this case, the warning is about a function inside the smart_open package that the gensim package is using. That is, it's not the .save() you are calling that's deprecated, but something inside .save(). The gensim authors will eventually update .save() to use the newly-preferred variant of what smart_open offers.
You can just keep using .save() and ignore the message as long as things still work for you – unless you'd like to contribute a fix to gensim that removes the warning from .save(). (It may, however, have already been fixed in the development code, to become available in the next gensim release.)
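If the message is cluttering your logs in the meantime, you can silence just that advisory with Python's standard warnings module. Below is a minimal sketch, assuming a Word2Vec model; the toy corpus and file name are purely illustrative, and the same idea applies to other gensim model classes.

import warnings
from gensim.models import Word2Vec

# Tiny toy corpus just so there is something to save (illustrative only).
model = Word2Vec([["hello", "world"], ["hi", "there"]], min_count=1)

# Suppress only the smart_open deprecation advisory; other warnings stay visible.
with warnings.catch_warnings():
    warnings.filterwarnings(
        "ignore",
        message="This function is deprecated, use smart_open.open instead",
        category=UserWarning,
    )
    model.save("mymodel.model")
    reloaded = Word2Vec.load("mymodel.model")

The filter is scoped to the with block, so deprecation notices from elsewhere in your code will still be shown.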
Related
I'm new(ish) to Laravel and I've been trying to get a clear understanding of how to use Faker to generate my test data. One of the things that's really confusing me right now is the different versions of Faker that I'm seeing in code examples. Some of them use fzaninotto/faker while others use fakerphp/faker (which I see is built into my Laravel 9.41.0 app).
What is the difference between the two packages and which one should I be using? Or will I need both due to differences between what fakes each package can generate?
I'd really appreciate some advice on this. My suspicion is that the fzaninotto package is an older predecessor of the fakerphp one, and that I should simply use fakerphp/faker and mentally read "fakerphp" whenever I see examples that use fzaninotto, but maybe that is just wishful thinking on my part.
I moved from sweetalert2 ^7.32.4 → ^8.11.7. I read the v8.0.0 breaking changes, and none of my code directly calls any of the impacted functions.
I wonder what is causing this mysterious error: (TypeError: this is undefined)
As reported in the SweetAlert2 release note for version 8.0.0, referenced at the top of the README, this major release introduced a breaking change in the way Swal is invoked. The release note also links to the reasoning behind this breaking change.
The changelog actually lists this breaking change under the title "BREAKING CHANGE: remove withNoNewKeyword enhancer". The content of that commit explains:
From now on the recommended way to use SweetAlert2 is:
Swal.fire({...options})
I went to look at the new documentation examples, and they all launch the popup with Swal.fire(), whereas before calling Swal() on its own was valid.
The problem is that if you search CHANGELOG.md for fire you won't find any mention of that function, yet this is a breaking change: calling it was not mandatory before and now it is.
Here is the fix for my specific problem.
I am creating a PythonAnalyzer using the following code:
var interpreterFactory = InterpreterFactoryCreator.CreateAnalysisInterpreterFactory(
    PythonLanguageVersion.V36.ToVersion());
var analyzer = PythonAnalyzer.Create(interpreterFactory);
Later on I also create and analyze a simple Python module that looks like this:
name = input('What is your name?\n')
print('Hi, %s.' % name)
Then I do module.Analysis.GetValuesByIndex("name", 4).
At this point I expected the "value" to be 'str', because that's what Visual Studio shows when I open the same file in it. However, I get 'object' instead. So it seems that a PythonAnalyzer constructed as above lacks some important information about where to find the standard library and/or its types.
Unfortunately, the documentation on PythonAnalyzer is lacking, so I was hoping the community could help understand how to configure it properly.
Congratulations on getting this far :)
What you're hitting here is the fact that CreateAnalysisInterpreterFactory is really intended for "pure" cases, where you have access to all the code that you're trying to analyze and nothing needs to be looked up. It is mostly used for the unit tests, or as a fallback when no copies of Python are installed. Depending on precisely which version of PTVS you are using, the bare information you're getting is either coming from DefaultDB\v3\python.pyi or CompletionDB\__builtin__.idb, both of which are somewhat lacking (by design).
Assuming you have a copy of Python installed, I would suggest creating an instance of InterpreterConfiguration with all of its details, and passing that to CreateInterpreterFactory (without "Analysis").
If you're on the latest sources (strongly recommended), this may run the interpreter in the background to collect information from it (you can control caching of this info with the DatabasePath and UseExistingCache members of InterpreterFactoryCreationOptions). If you are using the older version still, you'll need to trigger a completion DB regeneration or have one that you've created through VS.
And a final caveat: this part of PTVS is currently under some pretty heavy development at time of writing, so you'll either want to keep updating the version you're working against or stick with a slightly older one. Also feel free to post questions like this on the GitHub site, as while this is technically public API, it's barely documented at all and so the best help will come from the dev team.
I know that modifying a previous version of a Core Data model breaks lightweight migration.
But I get inverse-relationship warnings on old versions of the data model.
I tried removing the old versions that were only used before the first release of our app, following this SO question: "How to delete an old/unused Data Model Version in Xcode 4".
That removes the warnings (of course it should),
but then I cannot run our app anymore.
So I think I have to keep every version, even the ones not involved in lightweight migration.
I also saw this SO question, "How to disable no inverse relationship warning for CoreData in Xcode 4.2?", and tried setting MOMC_NO_INVERSE_RELATIONSHIP_WARNINGS to YES.
That doesn't remove the warnings,
and I don't want a future mistake of mine, such as a missing inverse relationship, to be ignored either.
So I think this option is not for my case.
In short: I don't want to see compiler warnings for old versions of the data model (.xcdatamodel),
because those models are no longer used, so their warnings are not significant.
But I do want to see future inverse-relationship warnings on newer versions of the data model.
What options do I have?
After upgrading from sass-3.1.7 to sass-3.1.8 I get this error:
Functions may only be defined at the root of a document.
Any idea how I can solve this?
I'm using some of Bourbon's mixins, and it's imported at the top of my stylesheets; that's all.
I had the same problem and could not solve it by modifying my code.
The way I solved it was to use an older version:
gem uninstall sass
gem install sass -v 3.1.1
OK, here is what I came up with:
The Sass team decided to make a change (in this case, "Functions may only be defined at the root of a document.") that made some plugins incompatible. In my case it was the Bourbon library. I opened a ticket on Bourbon's GitHub page, and the owner updated the code and released a new version that works with the latest API.
I think this change should have gotten a bigger version bump to indicate the API change.
Sass developer here. Mixins and functions were never meant to be allowed in a scoped context. A bug was recently fixed so that they are now also caught when defined in an imported file (before this fix they were only caught if defined in the primary Sass file).
That said, it's not a feature we're explicitly opposed to, but we would need to properly test it, document it, and support it as an official feature.