During an automated playtest of the template using real user interactions, the following issues and observations were recorded:
- **Missing Dependencies for Initialization Script:**
  - The script `init_project.py` tries to `import yaml` but fails because `PyYAML` and `python-dotenv` are not installed by default in the environment before `init_project.py` runs. A workaround was to manually run `pip install PyYAML python-dotenv` before the initialization, which breaks the promised zero-dependency start.
- **Gitingest Missing Error:**
  - At the end of the initialization flow, there is a critical failure notice: `❌ CRITICAL: gitingest not found. Memory updates disabled. Please install via pip.` This creates a suboptimal onboarding experience. The initialization should either auto-install this dependency or handle its absence more gracefully, since the user is never warned about this requirement.
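A graceful-degradation check could replace the hard failure above. This is a minimal sketch, not the template's actual code; the function name `ensure_gitingest` and the auto-install behavior are assumptions.

```python
import shutil
import subprocess
import sys

def ensure_gitingest(auto_install: bool = True) -> bool:
    """Return True if the gitingest CLI is available, optionally installing it."""
    if shutil.which("gitingest"):
        return True
    if auto_install:
        # Hypothetical auto-install step; assumes pip is usable via sys.executable.
        result = subprocess.run(
            [sys.executable, "-m", "pip", "install", "gitingest"],
            capture_output=True,
        )
        if result.returncode == 0 and shutil.which("gitingest"):
            return True
    # Degrade instead of failing: warn once and let the engine run without memory.
    print("⚠️  gitingest not found; memory updates will be disabled.")
    return False
```

Calling this early in the init flow lets the rest of initialization proceed with memory updates disabled rather than surfacing a CRITICAL notice at the end.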
- **Missing `jsonschema` Dependency:**
  - When attempting to run the engine for the first time via `./squad`, the script immediately failed with `Error importing modules: No module named 'jsonschema'`. The dependency is missing from the required setup sequence.
- **Hardcoded Directory Assumptions in `core/context.py`:**
  - The `ContextLoader` has a `_find_root()` method that relies heavily on execution happening from `src/core/context.py`. When running the `squad` command, the `ContextLoader` assumes the `.agents` folder sits in a specific hierarchical structure that no longer aligns with how the squad wrapper invokes `main.py` directly from `.agents/engine/main.py`. This caused a `FileNotFoundError` when trying to locate the persona configuration files; a manual code patch in `context.py` was needed to continue.
- **Method Signature Mismatch in `core/main.py`:**
  - Inside `generate_llm_graph()`, `provider.generate(prompt)` is called. However, the abstract `LLMProvider.generate()` requires two arguments: `system_prompt` and `user_prompt`. This mismatch throws `TypeError: GeminiProvider.generate() missing 1 required positional argument: 'user_prompt'`, preventing graph generation.
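The mismatch and its fix can be sketched as follows. `EchoProvider` and the prompt strings are stand-ins invented for this example; only the two-argument `generate()` signature comes from the report.

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    @abstractmethod
    def generate(self, system_prompt: str, user_prompt: str) -> str:
        ...

class EchoProvider(LLMProvider):
    """Stand-in provider used only to illustrate the corrected call site."""
    def generate(self, system_prompt: str, user_prompt: str) -> str:
        return f"[{system_prompt}] {user_prompt}"

provider = EchoProvider()
# provider.generate(prompt) raises TypeError: one positional argument short.
# Corrected call: split the single prompt into system instructions and task.
graph_response = provider.generate(
    system_prompt="You are a graph-generation assistant.",
    user_prompt="Generate the task graph for the current project.",
)
```

Passing keyword arguments at the call site also guards against silently swapping the two prompts.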
- **API Limits Hit:**
  - During the Gemini execution test, the provided API key quickly hit a `429 RESOURCE_EXHAUSTED` error due to free-tier limits. Though this is specific to the API key rather than a code defect, it highlights that the LLM provider integration needs rate-limit handling with graceful retries or fallback to survive quota exhaustion.
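One common shape for such rate-limit handling is exponential backoff with jitter. This is a generic sketch, not the template's code; `RuntimeError` stands in for the provider's real 429 exception type, which a production version would catch specifically.

```python
import random
import time

def with_retries(call, max_attempts=5, base_delay=1.0):
    """Retry `call` on rate-limit errors with exponential backoff and jitter.

    RuntimeError is a stand-in for the provider's 429 RESOURCE_EXHAUSTED
    exception; swap in the real exception class when wiring this up.
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except RuntimeError:
            if attempt == max_attempts - 1:
                raise  # quota still exhausted after all retries; surface it
            # Double the wait each attempt; jitter avoids synchronized retries.
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            time.sleep(delay)
```

Wrapping `provider.generate(...)` in such a helper would let the graph-generation step survive transient quota bumps instead of failing on the first 429.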
- The interactive prompts correctly detected "Integration Mode" based on the directory contents.
- The Git endpoint security bypass for "Integration Mode" works correctly.
- Adjust `init_project.py` to ensure `PyYAML`, `python-dotenv`, and `gitingest` are installed via `subprocess` (followed by `importlib.invalidate_caches()`) before attempting imports or execution loops.
- Add `jsonschema` to the list of core dependencies installed during template initialization.
- Refactor `ContextLoader._find_root()` in `context.py` to calculate the root directory robustly, and fix the persona path mismatch (`init_project.py` uses `config/` while `context.py` expects `config/defaults/`).
- Fix the `generate()` method call inside `main.py` to pass the correct arguments matching the `LLMProvider` signature by separating system instructions from the user task description.
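The dependency-bootstrap recommendation above could be sketched as a small pre-import installer. This is a minimal sketch under the assumption that `pip` is reachable via `sys.executable`; the `bootstrap` helper and its dependency list are illustrative, not the template's actual code.

```python
import importlib
import subprocess
import sys

REQUIRED = ["PyYAML", "python-dotenv", "gitingest", "jsonschema"]
# Distribution name -> importable module name, where the two differ.
IMPORT_NAMES = {"PyYAML": "yaml", "python-dotenv": "dotenv"}

def bootstrap(packages=REQUIRED):
    """Install any missing packages before the rest of the script imports them."""
    missing = []
    for pkg in packages:
        module = IMPORT_NAMES.get(pkg, pkg)
        try:
            importlib.import_module(module)
        except ImportError:
            missing.append(pkg)
    if missing:
        subprocess.check_call([sys.executable, "-m", "pip", "install", *missing])
        # Make freshly installed packages visible to subsequent imports.
        importlib.invalidate_caches()
```

Calling `bootstrap()` as the first statement of `init_project.py`, before any `import yaml`, would preserve the zero-dependency start the template promises.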