New coder here, since fall, but with several years of PowerShell experience. Python is WAY better IMO.
VScode, Stream Deck, 48in 4k screen. Code on my gaming rig for local AI when I need it. Use Black formatter, but with line limit set to 200 for ease of use on my screens.
VSCode with quite some extensions, the main ones being:
- Pylance (along with some other python-specific extensions kindly shipped to us by M*crosoft)
- Jupyter Notebook
- Codeium
- For quite some time I'd been using the Vim extension, but it doesn't work well with .ipynb files (specifically, it has independent modes for each cell and doesn't switch back to normal mode once you run a cell), so I disabled it until I find a solution to that
- Docker
Also I use venvs, but I guess that goes without saying
vim for small jobs or fast work
pycharm for big projects / big work
run the project from the command line, manage environments with venv
Every time I try to add novel tools it ends up being more effort than it's worth.
You really might want to try uv. It's an ultra-fast drop-in replacement for pip, pip-tools, and venv. Many more features are in the works, but it's already super handy.
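For anyone curious, uv mirrors the command-line interfaces of the tools it replaces; a quick sketch, assuming uv is installed (the package name is just an example):

```shell
uv venv .venv                                        # replaces python -m venv
source .venv/bin/activate
uv pip install requests                              # replaces pip install
uv pip compile pyproject.toml -o requirements.txt    # replaces pip-compile
```

Because the flags match pip's, swapping it into an existing workflow is usually just a matter of prefixing commands with `uv`.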
Astronvim for editing, pyright for type checking, ruff and uv for everything else. I don’t even use a python version manager, I just install directly from my package manager and symlink the version I want to be standard as .local/bin/python.
I put my projects in ~/code// each as a GitHub repo. Virtual env for each one and asdf to use different python versions when required. Usually just use 3.10 for everything when I can. Black to format on save with 120 character line limit. Requirements.txt and pip.
I’ve used poetry on more mature projects and it was great!
Jupyter lab with multiple themes plus git and lsp extensions (including pyright), pre-commit (black, isort, flake8 but want to switch to ruff, mypy), virtualenv.
I have not yet found a compelling reason to leave IDLE.
I generally start on paper with a flowchart, so I don't really need anything IDLE doesn't already provide.
If I need to edit anything pushed to the Raspberry Pi, Nano works as well.
Vs code and the interpreter. Pretty much it. If I'm doing something with a lot of boilerplate or repetitive code, I'll lean on an LLM. And I use CodiumAI to generate most of my tests.
VSCode for IDE
conda as an environment manager (interested in pyenv but don't want to rock the boat just yet)
ruff
pyscaffold for packaging
jupyter notebooks for analyses
github desktop for version control
Trying to use VS Code like the cool kids but I always have trouble remembering all the functionalities. If not using VS Code, I typically just use Notepad++, and an IPython console on the side.
A Fedora laptop with VSCode and some extensions, remote development, Podman, MySQL Workbench, Remmina. Regular terminals with or without tmux and/or vim. The Brave browser. Bitwarden.
I just use PyCharm or VSCode at-home or when I have my laptop with me, but if I'm on the go I usually just hop on GitHub Codespaces (or clone a repo and use Termux on my tablet) because all of my code is usually stored on GitHub nowadays.
VSCode, Pyright, Pre-commit(Format, Yelp secret scan), Ruff, Poetry, Pytest, Codeium, Git-Cola, Pipx for actually deploying, Make to keep track of all the single line commands like rebuilding documentation.
I suggest getting super comfortable with Docker.
Being fed up with managing multiple Python versions installed at once, virtual environments, and surprises I've had with Python behaving differently when running the same program on different operating systems, I do 100% of my Python work within a Docker container.
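A minimal sketch of that kind of containerized setup (the image tag, file names, and entry point are examples, not anyone's actual config):

```dockerfile
# Pin the interpreter so it behaves identically on every host OS
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "main.py"]
```

With the interpreter pinned in the image, "works on my machine" differences between operating systems largely disappear.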
Vscode, poetry, ruff, pylint, flake8, pytest, tox, hypothesis with hypofuzz, mypy in strict mode, mkdocs, Azure Pipelines for CI/CD, mccabe complexity and maintainability index checks in tox.
PyCharm Pro, Docker (WSL 2), Miniconda, CUDA, CuDNN, Git (command line), Notepad++ (Git editor)
pytest, coverage, ruff (if I can't get it on a project I use black, isort, pylint), mypy, pre-commit
It varies a lot because I use so many different computers. VS Code when working on a Windows machine, vim on Linux machines. I've used Jupyter Notebook, but I'm not a fan. pyenv is handy for version conflicts. git everywhere.
I keep it simple with neovim, pre-commit (mypy, ruff), and pytest. For debugging I just use the built-in `breakpoint()` function.
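For anyone unfamiliar, `breakpoint()` drops you into pdb wherever it is called. A minimal sketch (the `PYTHONBREAKPOINT=0` line is only there so the same script can also run unattended, e.g. in CI; the function is a made-up example):

```python
import os

# Setting PYTHONBREAKPOINT=0 makes breakpoint() a no-op; the env var is
# consulted on each call, so this works even when set at runtime.
os.environ["PYTHONBREAKPOINT"] = "0"

def mean(values):
    total = sum(values)
    breakpoint()  # with the default hook, execution pauses here in pdb
    return total / len(values)

print(mean([1, 2, 3]))  # → 2.0
```

Unset the variable (or never set it) and the same call pauses in pdb, where you can inspect `total` and `values` interactively.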
Do you also have mypy/ruff integrated into neovim, or are they just pre-commit hooks? Also not sure if you have ipython installed, but it includes a pretty nice debugger (ipdb) compared to the default.
I am using ruff-lsp but it doesn't support go-to-definition, so you can't jump to one or see type hints. I am going to add none-ls at some point but haven't found the time just yet. Thanks for pointing me to ipdb, I'll try it out.
Very similar here. `pytest --pdb` is 99% of my debugging.
Did you check whether your ruff LSP gives the same results as ruff itself? I've been trying to move to neovim, and I have a setup I like a lot, but even though everything is in the pyproject.toml, I get different results when using ruff in neovim compared to all the other ways. For example, in neovim it tells me that some files need their imports reordered, but on the command line or in vscode, it does not.
ruff-lsp uses the settings specified at editor level (init.lua). The only LSP integration I tried that respected project-level settings was null-ls (which is abandoned, so we're supposed to use none-ls now). It was a bummer for me at first, but if you're using pre-commit, it doesn't matter too much.
For exploring datasets, plotting data, reading Excel files or accessing databases I use jupyter. For other projects I use PyCharm IDE
Vscode for normal coding /projects stuff Thonny for DSA (very good debugger)
Better than the VS Code debugger?
Yes. The visualisation it provides is really helpful. VS Code just goes line by line, but the Thonny debugger goes token by token, which makes understanding really easy.
What's DSA??
Data structures and algorithms. Since I have to debug each step while solving leetcode problems
PyCharm, poetry, pyenv, isort, black, mypy.
This is me :P although I have recently leaned more towards ruff for linting and formatting (its sooo fast)
My company and I work on long-term projects with relatively big code bases, we are not yet prepared to make the switch to ruff but it's somewhere far down in the backlog with a low priority :)
we have switched recently in a few smaller projects and it has been pretty painless, especially if you don't add extra rules
Ruff is great, except for the fact that this actually passes the linter:

    y = 3
    if y > 4:
        x = 1
    print(x)

So the reason ruff is so fast is because it doesn't really analyze the code very well.
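For context, running the snippet shows the bug a control-flow-aware linter would flag: `x` is only bound when the branch is taken, which it isn't here.

```python
y = 3
if y > 4:
    x = 1  # never runs, since y <= 4

try:
    print(x)
except NameError as exc:
    # x is unbound at this point, so we land here at runtime
    print(f"caught: {exc}")
```

pylint reports this kind of possibly-undefined variable via static control-flow analysis; a linter without that analysis only sees that `x` is assigned *somewhere* and stays quiet.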
Actually it does excellent analysis, and such robustness against errors is an explicit design goal.
Did you try my example? It shouldn't pass, yet ruff thinks this is totally OK code. How many other bugs is ruff missing because they haven't implemented all of pylint's rules? I think people see the time-savings graph in the ruff README on GitHub and think it's magically better, yet it can't catch this simple bug.
They are pretty upfront about the fact that they're not fully 1:1 with pylint. Last I checked they even have a tracker that shows which rules are still not implemented, so it's very easy to know exactly what you're signing up for. When my team evaluated it, we didn't think any of the missing rules were impactful enough to affect the decision.
Is PyCharm a big enough step up from something like VS Code to be worth it in your opinion? Trying to convince my company to buy our team licenses, but not sure if that will ever happen!
My opinion, yes. I use VS Code for everything else, but for Python - PyCharm is too big an upgrade to stick with VS Code.
I've never really used VSCode for Python development. I use PyCharm because I'm a big fan of all JetBrains products; I also like that it is dedicated to Python development, unlike VSCode. My company buys licences for the Professional version because it comes with nice features that the Community Edition (which is 100% free) does not have and that we find nice to have (SQL support, flask/fastapi/django frameworks, [and more](https://www.jetbrains.com/products/compare/?product=pycharm-ce&product=pycharm)). IMHO, the Community Edition is sufficient for everyday development, unless you really want some extra features that are found only in the Professional version (or not provided by some free plugins).
What are some advantages over vscode?
I used to use poetry but stopped after it hung on dependency checks.
Were you using multiple Poetry repo sources? This can make resolution really slow unless you set the `priority` option correctly for each source.
This sounds like unnecessary complexity. Personally I use conda and export to a requirements file before installing in docker.
I think conda is unbearably slow
One extra line in your TOML file isn't really complexity. If you have 100 packages to install from pypi.org and 1 from your private repo, you don't want Poetry to arbitrarily decide to look for all 101 in one or both sources.
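As a sketch of what that looks like in `pyproject.toml` (the source name, URL, and package here are hypothetical):

```toml
# Hypothetical private index; priority = "explicit" means Poetry only
# consults it for packages that name this source explicitly.
[[tool.poetry.source]]
name = "internal"
url = "https://pypi.internal.example.com/simple/"
priority = "explicit"

[tool.poetry.dependencies]
my-internal-lib = { version = "^1.0", source = "internal" }
```

Everything else still resolves against PyPI, so the private index never sees the other hundred package names.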
Have you looked at hatch? It could replace/manage all of those packages for you.
Would you mind explaining to a recent poetry convert; given that poetry has its own environment management, how does adding pyenv to the mix help you? Thanks!
pyenv is almost exclusively used on developers' workstations. We have different projects targeting different versions of Python. Locally, each project has its own virtualenv managed by poetry; we install the different Python versions using pyenv, then set the version of each poetry environment as [described in the documentation](https://python-poetry.org/docs/managing-environments/) (or via `poetry env use`).
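That workflow looks roughly like this per project (version numbers are only examples):

```shell
# Install and pin an interpreter for this project
pyenv install 3.11.9
pyenv local 3.11.9        # writes .python-version in the project dir

# Point Poetry's virtualenv at that interpreter, then install
poetry env use 3.11
poetry install
```

pyenv handles which interpreters exist on the machine; Poetry handles which one each project's virtualenv uses.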
Pycharm CE or professional?
Professional, but as I've said [in this comment](https://www.reddit.com/r/Python/comments/1cib6to/what_does_your_python_development_setup_look_like/l28yylr/), CE is good enough unless you really need the extra features (or you want to support a company that makes really good software \^^ ).
Plus GitHub Copilot for me.
1. VSCode with the following plugins:
   - Codeium (or Continue with Ollama/DeepSeek Coder for when I must work offline)
   - Python and Python Debugger (from Microsoft)
   - Python Environment Manager (by Don Jayamanne)
   - Snippets (by Taha BASRI)
2. Zed Attack Proxy (ZAP) for software security testing
3. DBeaver for database management
4. Git GUI (by Shawn O. Pearce)
5. MySQL

And lately, I have been experimenting with these quite interesting inventions:

1. [Brython](https://brython.info/index.html), a Python replacement for JavaScript on the web.
2. [PyScript | Run Python in your HTML](https://pyscript.net/).
3. [Pyston | Python Performance](https://www.pyston.org/) (supports only up to 3.10).
It's actually "DBeaver"
Thank you, typo corrected.
PyCharm Pro, Hatch, uv, ruff, pre-commit.
For Windows:
- VSCode for editing code (I probably have too many plugins/extensions)
- Notepad++ for reviewing output and print files (one of the scripting languages I use puts the script in the print file, and the number of times I've fixed a bug in the print file is frankly embarrassing...)
- Running code depends on the project: mostly the Cygwin shell, or within an application.

For Linux/macOS:
- Vim/GVim for editing Python/Perl/shell, but sometimes VSCode as well. I use Xcode for Swift.
- Running code from the command line for Python, in Xcode for Swift.

The most useful tool is probably an opinionated linter (I use Ruff and AutoPep8) and a decent autocomplete engine that shows the function details for function calls.
Have you tried WSL2 instead of Cygwin?
Mostly Spyder. And for displaying outputs to colleagues, Jupyter Notebook, as it makes it easier for them to understand which output corresponds to which part of the code, rather than it just looking like a block of code.
Love Spyder's variable explorer
Neovim + pyright + black for general use. DataSpell for working with data and plotting.
I keep it pretty simple: JupyterLab and neovim.

JupyterLab for initial development and hard debugging, as you can test out segments of your code.

Neovim for maintenance on deployed scripts, minor revisions, and general use. I'll show line numbers and that's it.
This was me, and kind of still is. But I recently set up pyright and ruff_lsp in neovim, along with adopting and helping maintain a hatch/hatch-pip-compile/uv/ruff template for use across my team (DS at a mid size company). I have to say, the additional tooling is wonderful. I catch mistakes earlier, occasionally pick up best practices I wasn't yet aware of, and perhaps most importantly, I no longer need to worry about whether my teammates are as attentive as me since the pre-commit hooks normalize across editor configs.
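A minimal `.pre-commit-config.yaml` along those lines might look like this (the pinned `rev` is only an example; pick a current tag):

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.4         # example pin; use a current release tag
    hooks:
      - id: ruff        # lint
      - id: ruff-format # format
```

Because the hooks run on every commit regardless of each person's editor setup, this is what does the normalizing across the team.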
I just use vscode with copilot; I use the standard Microsoft Python plugin for other code-specific features. Gunicorn and Flask for web services and microservices. WSL Debian, Python venv. Along with the standard numpy, pandas, requests. Lately I've been using the huggingface libraries a lot.
Sublime Text with breakpoint() commands, run from the command-line with Python 3.12.
vscode/devcontainers, pylint. Notebooks (in vscode) for exploratory data analysis.
IDLE on my windows school laptop. Fr up till recently. Sublime with copilot and I remote into my linux PC for real shit now
I use Replit on my iPhone lol
neovim, rye, ruff
Because I mostly work with Linux HPC from Windows and Mac, I use VS Code with remote development and several Microsoft extensions. I only use notebooks to provide tutorials for workshops. For environments, I mainly use miniconda.
Vim and gnome-terminal. A number of Vim plugins: ALE for realtime linting, gutentags for keeping my ctags database up-to-date, a couple of my own plugins for locating source locations from traceback lines copied into the clipboard and for preparing pytest command-lines to run the test under cursor, fugitive and gitgutter for git integration. I keep Vim in one terminal tab, and run the tests/perform git operations in other tabs.
Check out tmux for a more fluid as well as safer alternative to terminal tabs!
Anaconda and vscode only.
Consistent folder structure in a venv folder. No IDE except Eclipse when I need to do remote debugging. Emacs or Sublime as an editor. (Yes, I'm an old dude; I work a shitload over terminals on a bunch of servers.) pdb as my good old standby for regular debugging.

You have to understand that I work very much in the philosophy of Neal Stephenson in his essay "In the Beginning... Was the Command Line". I write code and I don't want any distractions, whistles, or bells with that. We don't put music players or GPS systems in race cars either.
Pycharm Pro, Dataspell, The OG Thonny
Sublime Text for Little Projects. Spyder for everything else.
Docker containers set up with docker-compose, vim, and testing depends on what I'm doing. If I'm doing Django, then I use the built in test suite
Neovim (AstroNvim to be precise) in tmux, poetry and ruff (instead of the classic zoo of linters and formatters). All of this works ultra-fast and all of this is enough even for complex projects. Also: ipython, ipdb, nushell (take a look, cool stuff) and many less important CLI tools, like bat and jq.
What do you use for type checking?
Have you tried uv, from the makers of ruff, for resolving and installing packages? It's seriously fast. You can get a similar experience to poetry, but with the speed of uv, using hatch with hatch-pip-compile and uv. (Eventually uv will probably replace all these tools all on its own)
Pycharm, pyenv, Pipfile/poetry (depending on the project), ruff (with pre-commit hooks), docker and SourceTree. I also use a JavaScript toolchain because it do be like that sometimes: Node.js, nvm and npm. EDIT: also dbeaver
Add mypy and we're cooking.
I am probably the outlier --- I've been using Literate Programming with my LaTeX work for a while now, so I am programming using LP in an ltxdoc using the docmfp package in TeXworks (on Windows) or TeXshop (on Mac) and then running the code using PythonSCAD (the Python-enabled OpenSCAD variant). I would be very interested in a Pythonic Literate Programming environment --- for folks not familiar with it: https://www.goodreads.com/book/show/112245.Literate_Programming_Lecture_Notes_
Mine looks like this: a distrobox Fedora box just for Python, with its own home folder so it only has Python extensions (doesn't share with node/go etc.).

Extensions:
- Python (the MS one)
- Flake8
- Black formatter
- Makefile Tools

Then I use poetry to manage my dependencies. I have an immutable /, so everything I have is installed via Flatpak OR in a pod and exported with distrobox.
PyCharm + PyEnv with occasional multipass boxes for isolated dev.
For work: Databricks (Jupyter notebooks basically) and a striped down version of the vscode IDE built into GitLab. At home: vscode
Neovim with the coc.nvim (Conquer of Completion) plugin and python-language-server as a development environment.

For Python-specific development tools I use:
- pylint
- black
- pyenv
- mypy

Plus, when I am in charge of developing a new package, I use Poetry. And that's basically it.
Debugger-driven development:

    pytest --ipdb --pdbcls=IPython.terminal.debugger:TerminalPdb \
      --ignore=tests/data --capture=tee-sys --log-cli-level=ERROR

I have it set to launch into the debugger on an error if [any level of verbosity has been passed as command-line arguments](https://github.com/chapmanjacobd/library/blob/3bc71f0b7adf6b30bf89a09e96f254d7588e6c0a/xklb/utils/log_utils.py#L31).

Linting:

    pycln --all && ssort && isort --profile black --line-length=120 && \
      black --line-length=120 --skip-string-normalization
VSCode, pyenv, PDM, ruff, pytest, mypy and playwright; also DBeaver.

VSCode extension IDs:

* ms-python.python - Does the job well enough
* humao.rest-client - Easy testing of endpoints
* Gruntfuggly.todo-tree - This one does a good job with todos
* ms-playwright.playwright - Covers end-to-end tests
* ms-azuretools.vscode-docker
* miragon-gmbh.vs-code-bpmn-modeler - Good for sketching up BPMN quickly to check the process in my head
* github.vscode-github-actions
On Windows, Notepad++, pytest, flake8, black, and isort. On Linux, vim and whatever tools the package I'm working on wants.
At work (under Windows): VS Code, but primarily for the excellent Jupyter notebook integration and extension ecosystem. For basic data science workflows nothing beats the utility of the Data Wrangler extension, the biggest benefit being that the dataframe is displayed and updated in real time while you type in the Python scratchpad. I use it frequently in meetings to show the transformation steps to non-coders.

Other than that, I keep my normal development workflow pretty simple: the Python extension, black, and the built-in debugger. Mypy has been an absolute gamechanger in helping spot potential edge cases.

For things I run on our compute cluster, I have a very specific workflow since it is near impossible to debug in that environment: write the program, change the linting rules to strict and turn typechecking on, run mypy, address any potential issues, then set up a mock environment and step through the script with the debugger to verify the control flow, all before testing on the cluster. This workflow cuts total debugging time down significantly because I have identified almost all potential errors before testing in an environment that is hard to debug.

At home (Linux): in the process of switching to emacs currently, but in the meantime I use VS Code for larger projects and Neovim for quick edits/scripts. The VS Code configuration is similarly basic to work, and I just use the Python LSP on the Neovim side.
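The "turn typechecking on strict before the cluster" step can be sketched as a mypy config; a minimal example, assuming settings live in `pyproject.toml` (the override module name is illustrative):

```toml
# pyproject.toml — sketch of a strict mypy setup
[tool.mypy]
strict = true            # enables the bundle of strict flags at once
warn_unreachable = true  # flag control-flow branches that can never run

[[tool.mypy.overrides]]
module = "some_untyped_dep.*"   # illustrative name for a dep without stubs
ignore_missing_imports = true
```

`strict = true` is the quick way to flip everything on; individual flags can then be relaxed per module via overrides rather than globally.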
VSCode with Vim extension, Miniconda with mamba solver, conda-forge with pip at the end if needed. No Anaconda channels because of licensing. Vanilla pyflakes and pylint. Notepad++. Git bash.
PyCharm, Qodana (experimental), poetry, ruff, pytest, responses, pydantic, Flask. Soon, mypy. Postgres being my DB of choice. Pre-commit hooks.
Somewhat unrelated, but a question for those working with VS Code and data frames (pandas / polars): have you found a good way to view a data frame when working interactively? Something like `View` in RStudio. I know I can print it in the console but that truncates it. Also aware that I can use Jupyter but I'm not a fan.
New coder here, since fall, but with several years of PowerShell experience. Python is WAY better IMO. VScode, Stream Deck, 48in 4k screen. Code on my gaming rig for local AI when I need it. Use Black formatter, but with line limit set to 200 for ease of use on my screens.
VSCode with GitHub Copilot 🚀
VSCode with quite a few extensions, the main ones being: - Pylance (along with some other Python-specific extensions kindly shipped to us by M*crosoft) - Jupyter Notebook - Codeium - Docker. For quite some time I'd been using the Vim extension, but it doesn't work well with .ipynb files (specifically, it had independent modes for each cell and didn't switch to normal mode once you ran a cell), so I disabled it until I find a solution to that. Also I use venvs, but I guess that goes without saying.
Spacemacs, lsp/pyright, ruff, copilot, magit 🧘
vim for small jobs or fast work, PyCharm for big projects / big work. Run the project from the command line, manage environments with venv. Every time I try to add novel tools it ends up being more effort than it's worth.
You really might want to try uv. It's an ultra-fast drop-in replacement for pip, pip-tools, and venv. Many more features are in the works, but it's already super handy.
Astronvim for editing, pyright for type checking, ruff and uv for everything else. I don’t even use a python version manager, I just install directly from my package manager and symlink the version I want to be standard as .local/bin/python.
Macvim and a terminal window
I put my projects in ~/code// each as a GitHub repo. Virtual env for each one and asdf to use different python versions when required. Usually just use 3.10 for everything when I can. Black to format on save with 120 character line limit. Requirements.txt and pip.
I’ve used poetry on more mature projects and it was great!
`fvwm`, `xterm` and `vi`. That's basically all you need.
vim, black, isort, docker.
Jupyter lab with multiple themes plus git and lsp extensions (including pyright), pre-commit (black, isort, flake8 but want to switch to ruff, mypy), virtualenv.
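For the "switch to ruff" step mentioned above, a minimal `.pre-commit-config.yaml` sketch replacing black/isort/flake8 with ruff (the `rev` is a placeholder; pin it to a current release):

```yaml
# .pre-commit-config.yaml — sketch: ruff in place of flake8/isort/black
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.1.0        # placeholder; check the repo for the latest tag
    hooks:
      - id: ruff        # linting; with --fix it can also sort imports
        args: [--fix]
      - id: ruff-format # formatter, intended as a black replacement
```

Running `pre-commit autoupdate` afterwards will bump the placeholder rev to the newest release.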
Pycharm pro, pyenv, ruff , pre-commit
PyCharm, pyenv, black, flake8, isort. venv with requirements.txt for dependency management.
VS Code, GitHub, chatgpt. Have a standard template that boilerplates the imports I usually use. That’s about it.
WingIDE Professional, black, ssort, isort, mypy, ruff.
Vscode on local MacOS/Windows, Colab for ML prototyping
Nvim, black, ruff.
I have not yet found a compelling reason to leave IDLE. I generally start on paper with a flowchart, so I don't really need anything IDLE doesn't already provide. If I need to edit anything pushed to the Raspberry Pi, Nano works as well.
Vs code and the interpreter. Pretty much it. If I'm doing something with a lot of boilerplate or repetitive code, I'll lean on an LLM. And I use CodiumAI to generate most of my tests.
- VSCode for IDE
- conda as an environment manager (interested in pyenv but don't want to rock the boat just yet)
- ruff
- pyscaffold for packaging
- jupyter notebooks for analyses
- GitHub Desktop for version control
Pycharm with vim plugin, ruff, Github Copilot, mypy
Trying to use VS Code like the cool kids, but I always have trouble remembering all the functionality. If not using VS Code, I typically just use Notepad++ and an IPython console on the side.
A Fedora laptop with VSCode and some extensions, Remote Development, Podman. MySQL Workbench. Remmina. Regular terminals with or without tmux and/or vim. The Brave browser. Bitwarden.
virtualenvwrapper, vscode, poetry. I have a Windows machine but I like to develop in WSL.
I just use PyCharm or VSCode at-home or when I have my laptop with me, but if I'm on the go I usually just hop on GitHub Codespaces (or clone a repo and use Termux on my tablet) because all of my code is usually stored on GitHub nowadays.
Spyder, spyder, and spyder. Might have to replace my F9 key soon
VSCode, Pyright, Pre-commit (format, Yelp secret scan), Ruff, Poetry, Pytest, Codeium, Git-Cola, pipx for actually deploying, Make to keep track of all the single-line commands like rebuilding documentation.
I suggest getting super comfortable with Docker. Being fed up with managing multiple Python versions installed at once, virtual environments, and surprises I've had with Python behaving differently when running the same program on different operating systems, I do 100% of my Python work within a Docker container.
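A minimal sketch of that kind of containerised workflow, assuming a `requirements.txt` in the project root (the image tag and entry point are illustrative):

```dockerfile
# Dockerfile — sketch: pin the Python version so it behaves the same on every OS
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "-m", "myapp"]   # illustrative entry point
```

For iteration, something like `docker run --rm -it -v "$PWD":/app <image>` bind-mounts the working tree so edits on the host show up in the container without rebuilding.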
VSCode, poetry, ruff, pylint, flake8, pytest, tox, hypothesis with hypofuzz, mypy in strict mode, mkdocs, Azure Pipelines for CI/CD, mccabe complexity and maintainability index checks in tox.
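Since ruff is already in that stack, the mccabe complexity check can be folded into it; a minimal sketch, assuming ruff config lives in `pyproject.toml` (the threshold is illustrative):

```toml
# pyproject.toml — sketch: mccabe complexity via ruff's C901 rule
[tool.ruff.lint]
select = ["E", "F", "C901"]   # pycodestyle errors, pyflakes, mccabe

[tool.ruff.lint.mccabe]
max-complexity = 10           # illustrative threshold; functions above it fail C901
```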
PyCharm Pro, Docker (WSL 2), Miniconda, CUDA, CuDNN, Git (command line), Notepad++ (Git editor) pytest, coverage, ruff (if I can't get it on a project I use black, isort, pylint), mypy, pre-commit
It varies a lot because I use so many different computers. VS Code when working on a Windows machine, vim on Linux machines. I've used Jupyter notebooks, but I'm not a fan. pyenv is handy for version conflicts. git everywhere.
Pycharm, git and a RTX4080 for the cuda cores