slab_deactivate_memcg_cache_rcu_sched
Regular
4.4: Absent ⚠️
4.8: Absent ⚠️
4.10: Absent ⚠️
4.13: ✅ void slab_deactivate_memcg_cache_rcu_sched(struct kmem_cache *s, void (*deact_fn)(struct kmem_cache *));
Collision: Unique Global
Inline: No
Transformation: False
Instances:
In mm/slab_common.c (ffffffff811e77a0)
Location: mm/slab_common.c:678
Inline: False
Direct callers:
- mm/slub.c:__kmemcg_cache_deactivate
Symbols:
ffffffff811e77a0-ffffffff811e7801: slab_deactivate_memcg_cache_rcu_sched (STB_GLOBAL)
4.15: ✅ void slab_deactivate_memcg_cache_rcu_sched(struct kmem_cache *s, void (*deact_fn)(struct kmem_cache *));
Collision: Unique Global
Inline: No
Transformation: False
Instances:
In mm/slab_common.c (ffffffff811fd9e0)
Location: mm/slab_common.c:687
Inline: False
Direct callers:
- mm/slub.c:__kmemcg_cache_deactivate
Symbols:
ffffffff811fd9e0-ffffffff811fda43: slab_deactivate_memcg_cache_rcu_sched (STB_GLOBAL)
4.18: ✅ void slab_deactivate_memcg_cache_rcu_sched(struct kmem_cache *s, void (*deact_fn)(struct kmem_cache *));
Collision: Unique Global
Inline: No
Transformation: False
Instances:
In mm/slab_common.c (ffffffff8121ed10)
Location: mm/slab_common.c:713
Inline: False
Direct callers:
- mm/slub.c:__kmemcg_cache_deactivate
Symbols:
ffffffff8121ed10-ffffffff8121ed80: slab_deactivate_memcg_cache_rcu_sched (STB_GLOBAL)
5.0: ✅ void slab_deactivate_memcg_cache_rcu_sched(struct kmem_cache *s, void (*deact_fn)(struct kmem_cache *));
Collision: Unique Global
Inline: No
Transformation: False
Instances:
In mm/slab_common.c (ffffffff81231cf0)
Location: mm/slab_common.c:740
Inline: False
Direct callers:
- mm/slub.c:__kmemcg_cache_deactivate
Symbols:
ffffffff81231cf0-ffffffff81231d60: slab_deactivate_memcg_cache_rcu_sched (STB_GLOBAL)
5.3: Absent ⚠️
5.4: Absent ⚠️
5.8: Absent ⚠️
5.11: Absent ⚠️
5.13: Absent ⚠️
5.15: Absent ⚠️
5.19: Absent ⚠️
6.2: Absent ⚠️
6.5: Absent ⚠️
6.8: Absent ⚠️
arm64: Absent ⚠️
armhf: Absent ⚠️
ppc64el: Absent ⚠️
riscv64: Absent ⚠️
aws: Absent ⚠️
azure: Absent ⚠️
gcp: Absent ⚠️
lowlatency: Absent ⚠️
Regular
4.13 and 4.15: ✅
4.15 and 4.18: ✅
4.18 and 5.0: ✅