
TypeError: can only concatenate str (not "NoneType") to str #48

Open
jtresko opened this issue Dec 23, 2023 · 5 comments

jtresko commented Dec 23, 2023

I'm getting this on pretty much any command, even when going from Python to Python to try having it debug, test, etc., or converting to different libraries with --guidelines.

Here's the full console output (note: this example uses an obscure language, but I'm getting the same error on Python to Python):

 ➜ /workspaces/ODB/gpt-migrate-main/gpt_migrate (main) $ python main.py --sourcelang EasyLanguage --sourcedir /workspaces/ODB/Python/ELtoPy/ELSrc --sourceentry OD_23.02-Strategy.el --guidelines "You specialize in trading algorithms, you are an expert in converting TradeStation EasyLanguage code to Python using the Lumibot library" --targetdir /workspaces/ODB/Python/ELtoPy/PyTgt --targetlang python 

◐ Reading EasyLanguage project from directory '/workspaces/ODB/Python/ELtoPy/ELSrc', with entrypoint 'OD_23.02-Strategy.el'.
◑ Outputting python project to directory '/workspaces/ODB/Python/ELtoPy/PyTgt'.
Source directory structure: 

        ├── OD Strat Mashup.el
        ├── ELtoPySrc/
            │   └── OD_23.02-Strategy_GH-CoPilot1.py
        ├── OD_23.02-Strategy.el
        ├── OD_23.02-Strategy copy.el
        └── OD_23.02-Strategy_GH-CoPilot1.py

✅  Creating your environment...
Created Docker environment for python project in directory '/workspaces/ODB/Python/ELtoPy/PyTgt'.
Traceback (most recent call last):

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/main.py", line 127, in <module>
    app()

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/main.py", line 100, in main
    migrate(sourceentry, globals)

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/main.py", line 94, in migrate
    internal_deps_list, external_deps_list = get_dependencies(sourcefile=sourcefile,globals=globals)

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/steps/migrate.py", line 58, in get_dependencies
    external_dependencies = llm_run(prompt,

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/utils.py", line 39, in llm_run
    output = globals.ai.run(prompt)

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/ai.py", line 49, in run
    chat += msg

TypeError: can only concatenate str (not "NoneType") to str

Another example:

➜ /workspaces/ODB/gpt-migrate-main/gpt_migrate (main) $ python main.py --sourcelang python --sourcedir /workspaces/ODB/Python/ELtoPy/ELSrc/ELtoPySrc --sourceentry OD_23.02-Strategy_GH-CoPilot1.py --targetdir /workspaces/ODB/Python/ELtoPy/PyTgt --targetlang python 
◐ Reading python project from directory '/workspaces/ODB/Python/ELtoPy/ELSrc/ELtoPySrc', with entrypoint 'OD_23.02-Strategy_GH-CoPilot1.py'.
◑ Outputting python project to directory '/workspaces/ODB/Python/ELtoPy/PyTgt'.
Source directory structure: 

        └── OD_23.02-Strategy_GH-CoPilot1.py

✅  Creating your environment...
Created Docker environment for python project in directory '/workspaces/ODB/Python/ELtoPy/PyTgt'.
Traceback (most recent call last):

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/main.py", line 127, in <module>
    app()

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/main.py", line 100, in main
    migrate(sourceentry, globals)

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/main.py", line 94, in migrate
    internal_deps_list, external_deps_list = get_dependencies(sourcefile=sourcefile,globals=globals)

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/steps/migrate.py", line 58, in get_dependencies
    external_dependencies = llm_run(prompt,

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/utils.py", line 39, in llm_run
    output = globals.ai.run(prompt)

  File "/workspaces/ODB/gpt-migrate-main/gpt_migrate/ai.py", line 49, in run
    chat += msg

TypeError: can only concatenate str (not "NoneType") to str
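
For reference, both runs fail on the same statement, chat += msg at ai.py line 49, so at its core this is just Python refusing to concatenate None onto a str:

    chat = ""    # a str
    msg = None   # whatever the model call handed back here
    chat += msg  # TypeError: can only concatenate str (not "NoneType") to str
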
@MBaldo83

Hey @jtresko, I was wondering if you ever found a fix for this? I'm just trying to get the migration up and running, and I'm getting this as well. Is there a known workaround?
Thanks in advance :-)


jtresko commented Jan 28, 2024

No, I've sorta abandoned this one; I was just curious about the capability. It seems a critical piece is making sure Tree-sitter has definitions for both languages, though I'm not sure if that's where this error came from.

It worked when I tested something like Python to Node.js, I believe.

@mjroeleveld

Also getting this error. This repo seems useless now. @joshpxyne FYI

@windowshopr

Yeah, I've got the same issue trying to run the ollama/llama3 model on Windows 10 with Python 3.11:

│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\main.py:100 in main                         │
│                                                                                                  │
│    97 │   │   │   file_name = write_migration(sourcefile, external_deps_list, target_deps_per_   │
│    98 │   │   │   target_deps_per_file[parent_file].append(file_name)                            │
│    99 │   │                                                                                      │
│ > 100 │   │   migrate(sourceentry, globals)                                                      │
│   101 │   │   add_env_files(globals)                                                             │
│   102 │                                                                                          │
│   103 │   ''' 3. Testing '''                                                                     │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │                         ai = <ai.AI object at 0x00000210E5A7CD10>                            │ │
│ │          detected_language = None                                                            │ │
│ │                    globals = <__main__.Globals object at 0x00000210E5976BD0>                 │ │
│ │                 guidelines = ''                                                              │ │
│ │                    migrate = <function main.<locals>.migrate at 0x00000210E3311BC0>          │ │
│ │                      model = 'ollama/llama3'                                                 │ │
│ │           operating_system = 'linux'                                                         │ │
│ │ source_directory_structure = '        ├── .github/\n            │   ├── ISSUE_TEMPLATE/\n    │ │
│ │                              │   │  '+2224752                                                │ │
│ │                  sourcedir = 'C:\\Users\\chalu\\Desktop\\myapp-17.0'                        │ │
│ │                sourceentry = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py'       │ │
│ │                 sourcelang = 'python'                                                        │ │
│ │                 sourceport = None                                                            │ │
│ │                       step = 'all'                                                           │ │
│ │       target_deps_per_file = defaultdict(<class 'list'>, {})                                 │ │
│ │                  targetdir = 'C:\\Users\\chalu\\Desktop\\myappGolang'                         │ │
│ │                 targetlang = 'golang'                                                        │ │
│ │                 targetport = 8080                                                            │ │
│ │                temperature = 0.0                                                             │ │
│ │                  testfiles = 'app.py'                                                        │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\main.py:94 in migrate                       │
│                                                                                                  │
│    91 │   │   target_deps_per_file = defaultdict(list)                                           │
│    92 │   │   def migrate(sourcefile, globals, parent_file=None):                                │
│    93 │   │   │   # recursively work through each of the files in the source directory, starti   │
│ >  94 │   │   │   internal_deps_list, external_deps_list = get_dependencies(sourcefile=sourcef   │
│    95 │   │   │   for dependency in internal_deps_list:                                          │
│    96 │   │   │   │   migrate(dependency, globals, parent_file=sourcefile)                       │
│    97 │   │   │   file_name = write_migration(sourcefile, external_deps_list, target_deps_per_   │
│                                                                                                  │
│ ┌───────────────────────────────────── locals ─────────────────────────────────────┐             │
│ │              globals = <__main__.Globals object at 0x00000210E5976BD0>           │             │
│ │              migrate = <function main.<locals>.migrate at 0x00000210E3311BC0>    │             │
│ │          parent_file = None                                                      │             │
│ │           sourcefile = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py' │             │
│ │ target_deps_per_file = defaultdict(<class 'list'>, {})                           │             │
│ └──────────────────────────────────────────────────────────────────────────────────┘             │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\steps\migrate.py:58 in get_dependencies     │
│                                                                                                  │
│    55 │   │   │   │   │   │   │   │   │   │   │   │   │   sourcelang=globals.sourcelang,         │
│    56 │   │   │   │   │   │   │   │   │   │   │   │   │   sourcefile_content=sourcefile_conten   │
│    57 │                                                                                          │
│ >  58 │   external_dependencies = llm_run(prompt,                                                │
│    59 │   │   │   │   │   │   │   waiting_message=f"Identifying external dependencies for {sou   │
│    60 │   │   │   │   │   │   │   success_message=None,                                          │
│    61 │   │   │   │   │   │   │   globals=globals)                                               │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │ external_deps_prompt_template = 'The following prompt is a composition of prompt sections,   │ │
│ │                                 each with different pr'+1799                                 │ │
│ │                          file = <_io.TextIOWrapper                                           │ │
│ │                                 name='C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.… │ │
│ │                                 mode='r' encoding='cp1252'>                                  │ │
│ │                       globals = <__main__.Globals object at 0x00000210E5976BD0>              │ │
│ │ internal_deps_prompt_template = 'The following prompt is a composition of prompt sections,   │ │
│ │                                 each with different pr'+2207                                 │ │
│ │                        prompt = 'The following prompt is a composition of prompt sections,   │ │
│ │                                 each with different pr'+1775                                 │ │
│ │                    sourcefile = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py'    │ │
│ │            sourcefile_content = 'from .cli.command import main\n\nmain()\n'                  │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\utils.py:39 in llm_run                      │
│                                                                                                  │
│    36 │                                                                                          │
│    37 │   output = ""                                                                            │
│    38 │   with yaspin(text=waiting_message, spinner="dots") as spinner:                          │
│ >  39 │   │   output = globals.ai.run(prompt)                                                    │
│    40 │   │   spinner.ok("✅ ")                                                                  │
│    41 │                                                                                          │
│    42 │   if success_message:                                                                    │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │         globals = <__main__.Globals object at 0x00000210E5976BD0>                            │ │
│ │          output = ''                                                                         │ │
│ │          prompt = 'The following prompt is a composition of prompt sections, each with       │ │
│ │                   different pr'+1775                                                         │ │
│ │         spinner = <Yaspin frames=⠋⠙⠹⠸⠼⠴⠦⠧⠇⠏>                                                 │ │
│ │ success_message = None                                                                       │ │
│ │ waiting_message = 'Identifying external dependencies for                                     │ │
│ │                   C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__ma'+10                       │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\ai.py:49 in run                             │
│                                                                                                  │
│   46 │   │   for chunk in response:                                                              │
│   47 │   │   │   delta = chunk["choices"][0]["delta"]                                            │
│   48 │   │   │   msg = delta.get("content", "")                                                  │
│ > 49 │   │   │   chat += msg                                                                     │
│   50 │   │   return chat                                                                         │
│   51                                                                                             │
│   52                                                                                             │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │     chat = 'logrus,github.com/spf13/cobra,github.com/google/go-cmp/matchers,github.com/goog… │ │
│ │    chunk = ModelResponse(                                                                    │ │
│ │            │   id='chatcmpl-0022f59d-3cf6-4202-8205-3dfbe33133a1',                           │ │
│ │            │   choices=[                                                                     │ │
│ │            │   │   StreamingChoices(                                                         │ │
│ │            │   │   │   finish_reason='stop',                                                 │ │
│ │            │   │   │   index=0,                                                              │ │
│ │            │   │   │   delta=Delta(                                                          │ │
│ │            │   │   │   │   content=None,                                                     │ │
│ │            │   │   │   │   role=None,                                                        │ │
│ │            │   │   │   │   function_call=None,                                               │ │
│ │            │   │   │   │   tool_calls=None                                                   │ │
│ │            │   │   │   ),                                                                    │ │
│ │            │   │   │   logprobs=None                                                         │ │
│ │            │   │   )                                                                         │ │
│ │            │   ],                                                                            │ │
│ │            │   created=1715994803,                                                           │ │
│ │            │   model='llama3',                                                               │ │
│ │            │   object='chat.completion.chunk',                                               │ │
│ │            │   system_fingerprint=None                                                       │ │
│ │            )                                                                                 │ │
│ │    delta = Delta(content=None, role=None, function_call=None, tool_calls=None)               │ │
│ │  message = [                                                                                 │ │
│ │            │   {                                                                             │ │
│ │            │   │   'role': 'user',                                                           │ │
│ │            │   │   'content': 'The following prompt is a composition of prompt sections,     │ │
│ │            each with different pr'+1775                                                      │ │
│ │            │   }                                                                             │ │
│ │            ]                                                                                 │ │
│ │      msg = None                                                                              │ │
│ │   prompt = 'The following prompt is a composition of prompt sections, each with different    │ │
│ │            pr'+1775                                                                          │ │
│ │ response = <generator object ollama_completion_stream at 0x00000210E5A38460>                 │ │
│ │     self = <ai.AI object at 0x00000210E5A7CD10>                                              │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
└──────────────────────────────────────────────────────────────────────────────────────────────────┘
TypeError: can only concatenate str (not "NoneType") to str
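
The locals above show what's going on: the final streamed chunk has finish_reason='stop' and a delta with content=None, and delta.get("content", "") still returns None in that case, since .get only falls back to the default when the key is missing, not when it's present with a None value. The same behaviour with a plain dict (just an illustration):

    delta = {"content": None}
    msg = delta.get("content", "")  # None, not "" -- the key exists
    chat = ""
    chat += msg                     # TypeError: can only concatenate str (not "NoneType") to str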


hannesaasamets commented Aug 12, 2024

It can be sidestepped in ai.py on line 49 by changing chat += msg to:

chat += msg or ""

Commented here:
bfb06d6#r145282262
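
With that change, the streaming loop from the traceback above would read roughly like this (a sketch, not the exact file):

    for chunk in response:
        delta = chunk["choices"][0]["delta"]
        msg = delta.get("content", "")
        chat += msg or ""  # coalesce the None on the final finish_reason='stop' chunk
    return chat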
