
Conversation


@Elbehery Elbehery commented Aug 7, 2025

What does this PR do?

This PR renames categories of llama_stack loggers.

This PR aligns logging categories as per the package name, as well as reviews from initial #2868. This is a follow up to #3061.

Replaces #2868
Part of #2865

cc @leseb @rhuss

@meta-cla meta-cla bot added the CLA Signed label Aug 7, 2025
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch 2 times, most recently from 50b5ffe to 91cd285 on August 8, 2025 06:33
@Elbehery Elbehery closed this Aug 8, 2025
@Elbehery Elbehery deleted the 20250807_rename_logger_categories branch August 8, 2025 09:34
@Elbehery Elbehery restored the 20250807_rename_logger_categories branch August 8, 2025 09:35
@Elbehery Elbehery reopened this Aug 8, 2025
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from 91cd285 to 96fbb8e on August 8, 2025 09:43
@Elbehery Elbehery changed the title from "chore: rename log categories" to "refactor(logging): rename llama_stack logger categories" Aug 8, 2025

rhuss commented Aug 11, 2025

How did you check the code base for misalignment of the log category with the package name? Can you please show me your vibe coding context & prompt?

When I use

```
LLS uses statements like `get_logger(name=__name__, category="core")` to specify a category.
This category aligns with the package name (e.g., for `llama_stack/core/server/auth_providers.py`
it is the top-level dir that is used as the category name). Show me all places where
this alignment doesn't match.
```

I find quite a few more spots (actually 27 instead of 5). Can you check again, please? Also, can you please specify which "rule" you used to map the package name (file path)? I'm assuming you always use the first path component, but it sometimes seems that the category is randomly picked from *some* path component of the Python file.

It should be consistent and unambiguous, so that there is a clear rule, also for the future, about which category name to pick.

| File | Expected | Actual | Notes |
|---|---|---|---|
| llama_stack/core/utils/config_resolution.py | core | config_resolution | |
| llama_stack/core/server/server.py | core | server | 2 occurrences |
| llama_stack/core/server/quota.py | core | quota | |
| llama_stack/core/server/auth_providers.py | core | auth | |
| llama_stack/core/server/auth.py | core | auth | |
| llama_stack/providers/utils/telemetry/tracing.py | providers | core | |
| llama_stack/providers/remote/inference/ollama/ollama.py | providers | inference | |
| llama_stack/providers/inline/telemetry/meta_reference/console_span_processor.py | providers | telemetry | |
| llama_stack/providers/utils/inference/litellm_openai_mixin.py | providers | inference | |
| llama_stack/providers/inline/agents/meta_reference/agent_instance.py | providers | agents | |
| llama_stack/providers/utils/tools/mcp.py | providers | tools | |
| llama_stack/providers/utils/sqlstore/authorized_sqlstore.py | providers | authorized_sqlstore | |
| llama_stack/providers/remote/tool_runtime/model_context_protocol/model_context_protocol.py | providers | tools | |
| llama_stack/providers/remote/inference/together/together.py | providers | inference | |
| llama_stack/providers/remote/inference/fireworks/fireworks.py | providers | inference | |
| llama_stack/providers/utils/inference/model_registry.py | providers | core | |
| llama_stack/providers/remote/inference/vllm/vllm.py | providers | inference | |
| llama_stack/providers/utils/inference/openai_mixin.py | providers | core | |
| llama_stack/providers/inline/inference/meta_reference/inference.py | providers | inference | |
| llama_stack/providers/utils/sqlstore/sqlalchemy_sqlstore.py | providers | sqlstore | |
| llama_stack/providers/inline/agents/meta_reference/openai_responses.py | providers | openai_responses | |
| llama_stack/providers/utils/inference/prompt_adapter.py | providers | inference | |
| llama_stack/providers/utils/scheduler.py | providers | scheduler | |
| llama_stack/providers/inline/post_training/torchtune/common/checkpointer.py | providers | uncategorized | `get_logger("DEBUG")` uses default category |
| llama_stack/cli/stack/run.py | cli | server | |
| llama_stack/models/llama/tokenizer_utils.py | models | tokenizer_utils | |
| llama_stack/models/llama/llama3/tool_utils.py | models | inference | |
  • 27 mismatches found.
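A check like the one described above can be sketched as a small script. The following is a hypothetical, regex-based helper (`find_mismatches` is not part of llama_stack, and the actual review may have used different tooling); it flags every `get_logger(...)` call whose category differs from the first path component below `llama_stack/`:

```python
import re
from pathlib import Path

# Matches get_logger(..., category="...") and captures the category string.
CATEGORY_RE = re.compile(r'get_logger\([^)]*category="(?P<cat>[^"]+)"')

def find_mismatches(repo_root: str) -> list[tuple[str, str, str]]:
    """Return (file, expected, actual) triples for every logger call whose
    category does not equal the top-level package directory."""
    mismatches = []
    root = Path(repo_root)
    for path in sorted(root.glob("llama_stack/**/*.py")):
        rel = path.relative_to(root)
        expected = rel.parts[1]  # first component below llama_stack/
        for match in CATEGORY_RE.finditer(path.read_text(encoding="utf-8")):
            actual = match.group("cat")
            if actual != expected:
                mismatches.append((str(rel), expected, actual))
    return mismatches
```

A regex scan like this misses calls that rely on the default category (e.g. the `get_logger("DEBUG")` case in the table), so it is only a first-pass filter.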

```diff
 from llama_stack.log import get_logger

-logger = get_logger(name=__name__, category="auth")
+logger = get_logger(name=__name__, category="core")
```
Contributor:

What is the rule to extract the category? The first path element ("core"), or shouldn't it be the last path element ("server")?

Contributor Author:

I initially used the first path element, but over several reviews on the original PR I changed some of them accordingly :)

```diff
 from llama_stack.providers.utils.responses.responses_store import ResponsesStore

-logger = get_logger(name=__name__, category="openai_responses")
+logger = get_logger(name=__name__, category="agents")
```
Contributor:

Why "agents" (picked from somewhere within the package path) and not "providers" (the first path element in the package)?

Contributor Author:

ditto :)


rhuss commented Aug 11, 2025

It's also ok to introduce a more complex rule-set like:

  • Top-level package dir in general
  • For providers: The first directory below providers/{inline,remote}, but with inline:: and remote:: prefixed, respectively.
  • ....

Please propose such a rule-set. It shouldn't be too involved, though; try to keep it to fewer than 5 rules.
It should not be too complicated, but we can then add this rule-set to the CLAUDE.md or .cursor/ prompts, so that in the future the right logging category gets picked up automatically.

@Elbehery (Contributor Author)

How did you check the code base for misalignment of the log category with the package name? Can you please show me your vibe coding context & prompt?

When I use

```
LLS uses statements like `get_logger(name=__name__, category="core")` to specify a category.
This category aligns with the package name (e.g., for `llama_stack/core/server/auth_providers.py`
it is the top-level dir that is used as the category name). Show me all places where
this alignment doesn't match.
```

I find quite a few more spots (actually 27 instead of 5). Can you check again, please? Also, can you please specify which "rule" you used to map the package name (file path)? I'm assuming you always use the first path component, but it sometimes seems that the category is randomly picked from *some* path component of the Python file.

It should be consistent and unambiguous, so that there is a clear rule, also for the future, about which category name to pick.
| File | Expected | Actual | Notes |
|---|---|---|---|
| llama_stack/core/utils/config_resolution.py | core | config_resolution | |
| llama_stack/core/server/server.py | core | server | 2 occurrences |
| llama_stack/core/server/quota.py | core | quota | |
| llama_stack/core/server/auth_providers.py | core | auth | |
| llama_stack/core/server/auth.py | core | auth | |
| llama_stack/providers/utils/telemetry/tracing.py | providers | core | |
| llama_stack/providers/remote/inference/ollama/ollama.py | providers | inference | |
| llama_stack/providers/inline/telemetry/meta_reference/console_span_processor.py | providers | telemetry | |
| llama_stack/providers/utils/inference/litellm_openai_mixin.py | providers | inference | |
| llama_stack/providers/inline/agents/meta_reference/agent_instance.py | providers | agents | |
| llama_stack/providers/utils/tools/mcp.py | providers | tools | |
| llama_stack/providers/utils/sqlstore/authorized_sqlstore.py | providers | authorized_sqlstore | |
| llama_stack/providers/remote/tool_runtime/model_context_protocol/model_context_protocol.py | providers | tools | |
| llama_stack/providers/remote/inference/together/together.py | providers | inference | |
| llama_stack/providers/remote/inference/fireworks/fireworks.py | providers | inference | |
| llama_stack/providers/utils/inference/model_registry.py | providers | core | |
| llama_stack/providers/remote/inference/vllm/vllm.py | providers | inference | |
| llama_stack/providers/utils/inference/openai_mixin.py | providers | core | |
| llama_stack/providers/inline/inference/meta_reference/inference.py | providers | inference | |
| llama_stack/providers/utils/sqlstore/sqlalchemy_sqlstore.py | providers | sqlstore | |
| llama_stack/providers/inline/agents/meta_reference/openai_responses.py | providers | openai_responses | |
| llama_stack/providers/utils/inference/prompt_adapter.py | providers | inference | |
| llama_stack/providers/utils/scheduler.py | providers | scheduler | |
| llama_stack/providers/inline/post_training/torchtune/common/checkpointer.py | providers | uncategorized | `get_logger("DEBUG")` uses default category |
| llama_stack/cli/stack/run.py | cli | server | |
| llama_stack/models/llama/tokenizer_utils.py | models | tokenizer_utils | |
| llama_stack/models/llama/llama3/tool_utils.py | models | inference | |

* 27 mismatches found.

thanks for your review

So from the table, I conclude that the correct category is the outermost package name (e.g. for llama_stack/providers/utils/inference/prompt_adapter.py, the category is providers).


Elbehery commented Aug 11, 2025

It's also ok to introduce a more complex rule-set like:

* Top-level package dir in general

* For providers: The first directory below `providers/{inline,remote}`, but with `inline::` and `remote::` prefixed, respectively.

* ....

Please propose such a rule-set. It shouldn't be too involved, though; try to keep it to fewer than 5 rules. It should not be too complicated, but we can then add this rule-set to the CLAUDE.md or .cursor/ prompts, so that in the future the right logging category gets picked up automatically.

thanks for your guidance

please find my proposal below

  • Generally, use the Top-level package name.
  • For core, use the last package name. Please find examples below to clarify.
    • For llama_stack/core/utils/config_resolution.py use utils.
    • For llama_stack/core/server/server.py use server.
    • For llama_stack/providers/remote/inference/ollama/ollama.py use remote::inference::ollama.
    • For llama_stack/models/llama/llama3/tool_utils.py use llama3.
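For illustration, the proposal above could be sketched as a path-to-category function. This `category_for` helper is hypothetical, not actual llama_stack code, and generalizing the providers example to every sub-path below providers/ is an assumption on my part:

```python
from pathlib import PurePosixPath

def category_for(path: str) -> str:
    """Sketch of the proposed mapping from repository path to logger category."""
    parts = PurePosixPath(path).parts  # e.g. ("llama_stack", "core", "server", "auth.py")
    top = parts[1]                     # first package below llama_stack/
    if top == "core":
        return parts[-2]                    # last package name, e.g. "utils", "server"
    if top == "providers":
        return "::".join(parts[2:-1])       # e.g. "remote::inference::ollama"
    if top == "models":
        return parts[-2]                    # e.g. "llama3"
    return top                              # generally: the top-level package name
```

For example, `category_for("llama_stack/cli/stack/run.py")` yields `"cli"` and `category_for("llama_stack/providers/remote/inference/ollama/ollama.py")` yields `"remote::inference::ollama"`.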

ptal 👍🏽


rhuss commented Aug 11, 2025

thanks for the proposal

  • For core, use the last package name if relevant (e.g. server or auth), otherwise, fall to the top-level package name (i.e. core). Please find examples below to clarify

"if relevant" should never be part of any rule, as there is a lot of leeway in interpretation about what is actually "relevant".

You could say:

  • For all sub-packages of core/server, use "server". Otherwise, use "core" for anything else. Although I would be ok to also use the last component for everything below "core".

However, it would be equally acceptable to hand-pick a selection and not rely solely on the file path. For that, one should define the full list of categories. Perhaps it's better not to tie the category to the package name, as I don't see any regularity right now that could be leveraged to fully determine the log category from the package / directory name.


Elbehery commented Aug 11, 2025

"if relevant" should never be part of any rule, as there is a lot of leeway in interpretation about what is actually "relevant".

+1, I will remove it above, it should be clear to avoid confusion in the future 👍🏽

Perhaps it's better not to connect the package name, as I don't see any regularity right now that could be leveraged to fully determine the log category from the package / directory name.

+1, that is why I sometimes used an inner package name and other times the top-level package name.

I would appreciate input from others in order to have consensus over the categories names used now, and in the future

cc @leseb @ashwinb @mattf @nathan-weinberg @skamenan7 @derekhiggins @bbrowning @cdoern @ehhuang

@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch 3 times, most recently from 878808b to 4e5bc9a on August 20, 2025 08:11
Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 20, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from 4e5bc9a to b73b1db on August 20, 2025 21:39
Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 20, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from b73b1db to bbcd6b0 on August 20, 2025 21:51
Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 20, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch 2 times, most recently from f991345 to 1bad897 on August 21, 2025 11:36
Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 21, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery (Contributor Author)

cc @mattf this is a follow up to #3061

Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 21, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from 1bad897 to 7515c9c on August 21, 2025 22:51
Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 21, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from 7515c9c to c97dc1c on August 21, 2025 23:04
@ashwinb ashwinb left a comment

I am mostly good with this PR. I have a couple of comments inline, mainly about the addition of a file, which feels like a mistake.

```diff
 from llama_stack.providers.utils.kvstore.config import KVStoreConfig, SqliteKVStoreConfig

-logger = get_logger(__name__, category="core")
+logger = get_logger(__name__, category="core::store")
```
Contributor:

just name this core::registry

Contributor Author:

fixed 👍🏽

```diff
 )

-logger = get_logger(name=__name__, category="responses")
+logger = get_logger(name=__name__, category="agents::meta_reference")
```
Contributor:

I think this should really stay responses; maybe call this openai::responses. This is why we cannot take these dictums too formally sometimes.

Contributor Author:

fixed 👍🏽

@@ -0,0 +1,1154 @@
# Copyright (c) Meta Platforms, Inc. and affiliates.
Contributor:

why is this file added? is that a bad rebase?

Contributor Author:

yes, the commit is older than removing this file

thanks for catching this 👍🏽

Contributor Author:

removed 👍🏽

Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 21, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from c97dc1c to 39cc12e on August 21, 2025 23:10
Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 21, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from 39cc12e to 7a98003 on August 21, 2025 23:21
Elbehery added a commit to Elbehery/llama-stack that referenced this pull request Aug 21, 2025
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from 7a98003 to 23ef375 on August 21, 2025 23:22
This PR renames logging categories of the llama_stack logger across the codebase according to llamastack#3065 (comment).

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery force-pushed the 20250807_rename_logger_categories branch from 23ef375 to 430ea87 on August 21, 2025 23:38
@ashwinb ashwinb merged commit c3b2b06 into llamastack:main Aug 22, 2025
23 checks passed
franciscojavierarceo pushed a commit to franciscojavierarceo/llama-stack that referenced this pull request Aug 22, 2025
)

# What does this PR do?
This PR renames categories of llama_stack loggers.

This PR aligns logging categories as per the package name, as well as
reviews from initial
llamastack#2868. This is a follow up
to llamastack#3061.


Replaces llamastack#2868
Part of llamastack#2865

cc @leseb @rhuss

Signed-off-by: Mustafa Elbehery <[email protected]>
@Elbehery Elbehery deleted the 20250807_rename_logger_categories branch August 22, 2025 07:32
