
Conversation

@phazei commented Jan 24, 2025

Pull Request

Description

Fixed some miscellaneous build issues; some still remain.

Added fixes for stale cache keys. Previously, if a key existed but the file was missing from the filesystem, the entry was simply broken. The lookup now verifies that the file exists; if it doesn't, the key is cleared from memoryCache, which in turn clears it from the cachePolicy.

Added new ProxyCacheManager methods:
get memoryCache() - exposes the memory cache for direct use
clearMemoryCache() - clears only the memory cache
clearCache() - clears the memory cache and deletes the disk files
removeCachedVideo(url) - removes a single URL from memory and disk
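Taken together, the new manager surface might be used like this. A hedged sketch only: the interface shape mirrors the method list above, but the stub implementation is a stand-in for illustration, not the library's real code.

```typescript
// Hypothetical sketch of the ProxyCacheManager methods added in this PR.
// Method names come from the PR description; everything else is assumed.
interface VideoCacheManager {
  clearMemoryCache(): void;
  clearCache(): Promise<void>;
  removeCachedVideo(url: string): Promise<void>;
}

// In-memory stand-in so the sketch runs without the library.
function makeStubManager(): VideoCacheManager & { entries: Set<string> } {
  const entries = new Set<string>(["https://cdn.example.com/a.mp4"]);
  return {
    entries,
    clearMemoryCache: () => entries.clear(),
    clearCache: async () => entries.clear(),
    removeCachedVideo: async (url) => {
      entries.delete(url);
    },
  };
}

async function demo(): Promise<number> {
  const mgr = makeStubManager();
  // Drop a single corrupted entry from memory (and, in the real manager, disk).
  await mgr.removeCachedVideo("https://cdn.example.com/a.mp4");
  return mgr.entries.size;
}
```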

Added to useAsyncCache:
removeVideoFromCache(url), callable from a component where the URL may have been set. If a cached entry becomes corrupted, this makes it possible to delete it and start over; previously it was permanently broken.
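A sketch of the recovery path this enables. Only the removeVideoFromCache name comes from this PR; the error-handler shape around it is an assumption for illustration.

```typescript
// Hedged sketch: recovering from a corrupted cache entry with the new
// removeVideoFromCache(url) exposed by useAsyncCache (handler shape assumed).
type RemoveVideoFromCache = (url: string) => Promise<void>;

async function handlePlaybackError(
  url: string,
  removeVideoFromCache: RemoveVideoFromCache
): Promise<string> {
  // Before this PR a corrupted entry was permanently broken; now the
  // component can purge it and let the next play attempt re-cache it.
  await removeVideoFromCache(url);
  return `purged ${url}`;
}
```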

Fixed cases in LFUPolicy where onEvict could fail when all entries had equal frequency; it now falls back to the first cache entry that was added.

Added a new CachePolicy: LFUSizePolicy
This works like LFUPolicy, but the number passed at initialization is the cache size in MB, so you can now control how large the cache grows. It retrieves the file list and deletes the least frequently used files, falling back to the file with the oldest timestamp when that doesn't resolve a candidate. I also had to add more calls to onEvict: it was previously only invoked when files were first added, and at that point they were still small because little had streamed yet. onEvict is now also called on memoryCache get in addition to put; the extra calls shouldn't matter for the other policies. Finally, I added an isEvicting check to onEvict so that multiple concurrent video streams don't attempt evictions at the same time.
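The isEvicting check reduces to a simple re-entrancy latch. A minimal sketch of the idea, not the library's actual eviction code:

```typescript
// Minimal sketch of the isEvicting guard described above: concurrent streams
// can each trigger an eviction check, but only one pass runs at a time.
// Details are illustrative; the real policy does filesystem work inside.
class EvictionGuard {
  private isEvicting = false;
  public passes = 0;

  async onEvict(work: () => Promise<void>): Promise<void> {
    if (this.isEvicting) return; // another eviction is in progress
    this.isEvicting = true;
    try {
      this.passes++;
      await work();
    } finally {
      this.isEvicting = false;
    }
  }
}
```

Because the flag is set synchronously before the first await, a second call arriving while the first pass is suspended returns immediately instead of running a concurrent eviction.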

Type of Change

  • Bug Fix
  • New Feature
  • Enhancement
  • Documentation Update
  • Other (please specify)

Checklist

  • I have read the contribution guidelines and followed the process.
  • My code follows the coding standards of this project.
  • I have added unit tests for my changes / integrated them into the example project.
  • I have updated the documentation accordingly.
  • The code builds and passes all existing tests locally.

The example project doesn't build; it's too old and hasn't been updated in a long time. eslint, the other checks, and the TypeScript build all pass. I added as many tests as already existed.

Summary by CodeRabbit

  • New Features

    • Introduced a new cache eviction policy that combines least frequently used (LFU) strategy with size constraints.
    • Added the ability to clear the entire memory cache or remove individual cached videos.
    • Added methods for listing file statistics and clearing cache policies.
    • Exposed new cache management utilities for improved cache control.
  • Improvements

    • Enhanced cache eviction logic for better consistency and efficiency.
    • Increased the minimum iOS deployment target to iOS 12.4.
    • Updated the "react-native" dependency to version 0.72.17.
  • Bug Fixes

    • Improved consistency between memory cache and file system by cleaning up missing cache files automatically.

@phazei (author) commented Jul 9, 2025

HELLO???

@nguyenvanphituoc (owner):
Hi @phazei, sorry for the late reply. I will take a look and process it ASAP.

Thanks for your contribution.

@nguyenvanphituoc nguyenvanphituoc requested a review from Copilot July 31, 2025 08:43
@nguyenvanphituoc nguyenvanphituoc self-assigned this Jul 31, 2025
@nguyenvanphituoc nguyenvanphituoc added the enhancement New feature or request label Jul 31, 2025
coderabbitai bot commented Jul 31, 2025

Walkthrough

This update introduces new cache management and eviction features, including a size-based LFU policy, explicit cache clearing and removal methods, and enhanced synchronization between memory and file system caches. Several interfaces and classes are extended with methods for clearing caches and removing entries. Minor dependency and configuration updates are also included.

Changes

Cohort / File(s) Change Summary
Dependency and Configuration Updates
example/package.json, package.json, react-native-cache-video.podspec
Updated react-native dependency from 0.72.6 to 0.72.17 and raised the minimum iOS deployment target from 11.0 to 12.4 in the podspec.
Cache Hook Enhancements
src/Hooks/useCache.ts
Added removeVideoFromCache async function to the useAsyncCache hook, enabling removal of cached videos by URL and updating internal state accordingly.
File System Utilities
src/Libs/fileSystem.ts
Added getStatisticList method to FileSystemManager for retrieving directory status lists.
Memory Cache Free Policy
src/Provider/MemoryCacheFreePolicy.ts
Removed unnecessary constructor bindings, added clear() and removeEntry(_key: string) methods as empty stubs.
Memory Cache LFU Policy
src/Provider/MemoryCacheLFUPolicy.ts
Added clear() and removeEntry(key: string) methods, improved eviction logic to handle stale entries and fallback to oldest item eviction, removed unused bindings.
Memory Cache LFU Size Policy (New)
src/Provider/MemoryCacheLFUSizePolicy.ts
Introduced LFUSizePolicy class implementing LFU eviction with size constraints, frequency tracking, and file system integration.
Memory Cache Provider Enhancements
src/Provider/MemoryCacheProvider.ts
Updated get, put, and syncCache methods to coordinate with cache policy, added clear() method for full cache and metadata reset.
PreCacheProvider Constructor Cleanup
src/Provider/PreCacheProvider.ts
Removed redundant .bind(this) calls from constructor.
Provider Exports
src/Provider/index.ts
Exported MemoryCacheLFUSizePolicy module.
Cache Manager Improvements
src/ProxyCacheManager.ts
Removed method bindings in constructor, added getter for memory cache, implemented clearMemoryCache, clearCache, and removeCachedVideo methods, improved cache-file consistency checks.
Type Definitions
src/types/type.d.ts
Extended MemoryCachePolicyInterface and MemoryCacheInterface with clear and removeEntry methods, updated onEvict signature.
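The extended contracts in the last row can be sketched as follows. Signatures are inferred from this summary, not copied from src/types/type.d.ts, so the real declarations may differ in detail.

```typescript
// Hedged sketch of the extended policy contract described above: clear()
// and removeEntry() added, and onEvict gaining an optional trigger key.
interface MemoryCachePolicyInterface {
  onEvict(
    cache: Map<string, string>,
    delegate?: unknown,
    triggerKey?: string // new optional parameter per this PR
  ): void | Promise<void>;
  clear(): void; // reset the policy's internal bookkeeping
  removeEntry(key: string): void; // drop tracking for a single entry
}

// A FreePolicy-style implementation can satisfy the interface with empty
// stubs, consistent with the MemoryCacheFreePolicy row above.
const noopPolicy: MemoryCachePolicyInterface = {
  onEvict: () => {},
  clear: () => {},
  removeEntry: (_key: string) => {},
};
```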

Sequence Diagram(s)

Video Cache Removal Flow

sequenceDiagram
  participant App
  participant useAsyncCache
  participant cacheManager
  participant MemoryCache
  participant FileSystem

  App->>useAsyncCache: removeVideoFromCache(url)
  useAsyncCache->>cacheManager: removeCachedVideo(url)
  cacheManager->>MemoryCache: syncCache(url)
  cacheManager->>FileSystem: remove file (if exists)
  useAsyncCache->>useAsyncCache: clear local state if url matches

Memory Cache Eviction with Size Policy

sequenceDiagram
  participant MemoryCacheProvider
  participant LFUSizePolicy
  participant FileSystemManager

  MemoryCacheProvider->>LFUSizePolicy: onEvict(cache, delegate, triggerKey)
  LFUSizePolicy->>FileSystemManager: getStatisticList(cacheDir)
  LFUSizePolicy->>MemoryCacheProvider: remove least frequently used entries
  LFUSizePolicy->>FileSystemManager: remove files as needed

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~35 minutes

Poem

A bunny with cache on its mind,
Hopped through the code to see what it’d find.
With LFU and size, now the cache is so wise,
Old videos gone with a flick of its paws.
Clear, remove, and tidy anew—
The warren runs faster, thanks to this crew!
🐇✨

@Copilot Copilot AI left a comment

Pull Request Overview

This PR adds cache management capabilities and introduces a new size-based cache policy. It fixes several build issues and improves cache reliability by handling stale cache entries.

  • Added new cache management methods to clear cache and remove individual cache entries
  • Introduced LFUSizePolicy for managing cache size in MB instead of number of entries
  • Fixed stale cache handling where cached keys existed but files were missing from filesystem

Reviewed Changes

Copilot reviewed 12 out of 15 changed files in this pull request and generated 7 comments.

Show a summary per file
File Description
src/ProxyCacheManager.ts Added cache management methods and stale cache cleanup logic
src/Provider/MemoryCacheLFUSizePolicy.ts New cache policy that manages cache by size in MB
src/Provider/MemoryCacheProvider.ts Enhanced with clear method and eviction improvements
src/Provider/MemoryCacheLFUPolicy.ts Fixed fallback eviction when all items have equal frequency
src/Hooks/useCache.ts Added removeVideoFromCache hook for component-level cache management
package.json Updated React Native version from 0.72.6 to 0.72.17

);

let totalSize = files.reduce(
(sum, file) => sum + parseInt(file.size as unknown as string, 10),
Copilot AI commented Jul 31, 2025
Using parseInt on file.size with the cast as unknown as string is fragile. parseInt coerces non-string arguments to strings, so an ordinary numeric size still parses, but large values that stringify in scientific notation are silently truncated. Consider checking the type first or using Number(file.size) instead.

Suggested change
(sum, file) => sum + parseInt(file.size as unknown as string, 10),
(sum, file) => sum + Number(file.size),
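A quick behavior check on the parseInt-vs-Number point: parseInt coerces non-string arguments with String(), so an ordinary integral size does not actually produce NaN, but Number(file.size) is still the safer choice because very large values stringify in scientific notation and parseInt silently truncates them.

```typescript
// Behavior check for the suggestion above. The `as unknown as string` casts
// mirror the reviewed code; the values are illustrative.
const asNumber = 4096 as unknown as string;
const viaParseInt = parseInt(asNumber, 10); // coerced via String(4096) -> 4096
const viaNumber = Number(asNumber); // 4096

// The real hazard: numbers large enough to stringify in scientific notation.
const huge = 1e21 as unknown as string;
const hugeParsed = parseInt(huge, 10); // parses "1e+21" up to the 'e' -> 1
const hugeNumber = Number(huge); // 1e21, preserved
```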


Comment on lines +78 to +79
// console.log('::::::::::::::::: REFERENCE_BIT', this.referenceBit);
// console.log('::::::::::::::::: CACHE', Object.fromEntries(cache));
Copilot AI commented Jul 31, 2025

Remove commented-out console.log statements. These debug logs should not be committed to production code.

Suggested change
// console.log('::::::::::::::::: REFERENCE_BIT', this.referenceBit);
// console.log('::::::::::::::::: CACHE', Object.fromEntries(cache));


Comment on lines +93 to +95
// console.log('::::::::::::: COUNT', count, ':::');
// console.log('::::::::::::: EVICTKEY', count, evictedKey, ':::');
// console.log('::::::::::::: FILES', count, files.length, ':::');
Copilot AI commented Jul 31, 2025

Remove commented-out console.log statements. These debug logs should not be committed to production code.

Suggested change
// console.log('::::::::::::: COUNT', count, ':::');
// console.log('::::::::::::: EVICTKEY', count, evictedKey, ':::');
// console.log('::::::::::::: FILES', count, files.length, ':::');



/**
*
- LFUSize (Least Recently Used by Size): The least recently used item is evicted. This bases the eviction check on cache directory size in MB.
Copilot AI commented Jul 31, 2025

The comment incorrectly describes LFUSize as 'Least Recently Used by Size' when it should be 'Least Frequently Used by Size' based on the class name and implementation.


: this.referenceBit[key]! + 1;
}

async onEvict(
Copilot AI commented Jul 31, 2025

The onEvict method is called on every cache access (line 35 in MemoryCacheProvider.ts), but it performs expensive file system operations including directory scanning and file size calculations. Consider implementing a more efficient approach like periodic cleanup or size tracking.


Comment on lines +35 to +36
this.cachePolicy.onEvict(this.cache, this.delegate, key);

Copilot AI commented Jul 31, 2025

Calling onEvict on every cache get operation could significantly impact performance, especially for the LFUSizePolicy which performs file system operations. Consider calling onEvict only when necessary (e.g., when cache is near capacity).

Suggested change
this.cachePolicy.onEvict(this.cache, this.delegate, key);


return cachedKey;
} else {
// File missing - clean up cache entries
this._memoryCache?.syncCache(url);
Copilot AI commented Jul 31, 2025

[nitpick] The method name 'syncCache' is unclear. Based on the context where it's used to clean up stale cache entries, a more descriptive name like 'removeEntry' or 'cleanupStaleEntry' would be better.

Suggested change
this._memoryCache?.syncCache(url);
this._memoryCache?.cleanupStaleEntry(url);


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 5

🧹 Nitpick comments (3)
src/Libs/fileSystem.ts (1)

136-142: Add method binding in constructor for consistency.

The new getStatisticList method should be bound in the constructor like other methods (line 32 shows getStatistic being bound).

Add this line in the constructor after line 32:

     this.getStatistic.bind(this);
+    this.getStatisticList.bind(this);
src/ProxyCacheManager.ts (2)

153-157: Simplify redundant null check

The explicit if check is redundant since optional chaining already handles the null/undefined case safely.

 clearMemoryCache(): void {
-  if (this._memoryCache) {
-    this._memoryCache?.clear();
-  }
+  this._memoryCache?.clear();
 }

168-191: Consider logging file deletion errors

While the logic correctly prioritizes memory cache cleanup over file deletion success, silently swallowing errors could make debugging difficult in production.

       try {
         await this.didEvictHandler(key, cachedPath);
       } catch (error) {
         // Still succeeded in cleaning cache/policy even if file deletion failed
+        console.warn(`Failed to delete cached file for ${key}:`, error);
       }
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 68d4338 and 386f668.

⛔ Files ignored due to path filters (2)
  • example/ios/Podfile.lock is excluded by !**/*.lock
  • yarn.lock is excluded by !**/yarn.lock, !**/*.lock
📒 Files selected for processing (13)
  • example/package.json (1 hunks)
  • package.json (1 hunks)
  • react-native-cache-video.podspec (1 hunks)
  • src/Hooks/useCache.ts (1 hunks)
  • src/Libs/fileSystem.ts (1 hunks)
  • src/Provider/MemoryCacheFreePolicy.ts (1 hunks)
  • src/Provider/MemoryCacheLFUPolicy.ts (4 hunks)
  • src/Provider/MemoryCacheLFUSizePolicy.ts (1 hunks)
  • src/Provider/MemoryCacheProvider.ts (4 hunks)
  • src/Provider/PreCacheProvider.ts (0 hunks)
  • src/Provider/index.ts (1 hunks)
  • src/ProxyCacheManager.ts (3 hunks)
  • src/types/type.d.ts (2 hunks)
💤 Files with no reviewable changes (1)
  • src/Provider/PreCacheProvider.ts
🧰 Additional context used
🧬 Code Graph Analysis (2)
src/Provider/MemoryCacheLFUPolicy.ts (1)
src/Utils/constants.ts (1)
  • SECOND_CHANCE_TO_COUNT (5-5)
src/ProxyCacheManager.ts (2)
src/Utils/util.ts (1)
  • getCacheKey (188-196)
src/Utils/constants.ts (1)
  • KEY_PREFIX (6-6)
🔇 Additional comments (19)
example/package.json (1)

14-14: LGTM! Routine patch version update.

The React Native patch version update from 0.72.6 to 0.72.17 aligns with the main package.json and should include bug fixes and improvements.

package.json (1)

69-69: LGTM! Consistent dependency update.

The React Native patch version update maintains consistency with the example package and should provide bug fixes and improvements.

react-native-cache-video.podspec (1)

14-14: iOS deployment target aligns with React Native 0.72.17

  • react-native-cache-video.podspec (line 14): s.platforms = { :ios => "12.4" } matches the minimum iOS version specified in nearly all React Native 0.72.17 podspecs.
  • No further code changes are required.

Please ensure you document this bump to iOS 12.4 as a breaking change in your release notes or README so users on older iOS versions are aware.

src/Provider/index.ts (1)

2-2: LGTM! Clean export addition for new cache policy.

The export addition properly exposes the new MemoryCacheLFUSizePolicy class and follows the existing export pattern.

src/Provider/MemoryCacheFreePolicy.ts (1)

10-14: LGTM! Clean implementation of interface requirements.

The empty stub methods for clear() and removeEntry() are appropriate for the FreePolicy since it doesn't maintain any internal state or tracking data. The simplified constructor is also a good cleanup.

src/Hooks/useCache.ts (1)

62-79: Well-implemented cache removal function.

The removeVideoFromCache function properly integrates with the cache manager and handles local state cleanup when the removed video matches the currently cached video. The memoization and dependency management are correct.

src/Provider/MemoryCacheLFUPolicy.ts (3)

34-40: LGTM! Clean state management methods.

The clear() and removeEntry() methods properly manage the internal referenceBit tracking state.


89-99: Smart fallback eviction strategy.

The fallback to evict the oldest item when no suitable frequency-based candidate is found ensures the cache capacity constraint is always maintained.


68-82: Excellent improvements to eviction logic.

The stale key cleanup prevents memory leaks, and the enhanced frequency comparison properly handles the SECOND_CHANCE_TO_COUNT special case. However, there's a subtle issue with the eviction logic.

The condition on line 77 freq !== SECOND_CHANCE_TO_COUNT || lfuKey === null may not work as intended. If lfuKey is already set to a non-SECOND_CHANCE_TO_COUNT item, this condition could still replace it with a SECOND_CHANCE_TO_COUNT item when a better candidate exists.

Consider this fix:

-      if (freq && freq < minFreq) {
-        // Consider SECOND_CHANCE_TO_COUNT items if nothing else found
-        if (freq !== SECOND_CHANCE_TO_COUNT || lfuKey === null) {
-          minFreq = freq;
-          lfuKey = key;
-        }
-      }
+      if (freq && freq < minFreq && freq !== SECOND_CHANCE_TO_COUNT) {
+        minFreq = freq;
+        lfuKey = key;
+      } else if (freq === SECOND_CHANCE_TO_COUNT && lfuKey === null) {
+        // Only use SECOND_CHANCE_TO_COUNT items as last resort
+        minFreq = freq;
+        lfuKey = key;
+      }

Likely an incorrect or invalid review comment.

src/types/type.d.ts (2)

52-58: Well-designed interface enhancements.

The updated onEvict signature with optional triggerKey parameter and the new clear() and removeEntry() methods provide the necessary contracts for enhanced cache management while maintaining backward compatibility.


72-72: Good addition to cache interface.

The clear() method addition to MemoryCacheInterface provides a consistent API for cache clearing operations.

src/Provider/MemoryCacheProvider.ts (4)

35-35: Good integration with enhanced eviction logic.

Passing the current key to onEvict enables cache policies to make more informed eviction decisions.


49-49: Consistent eviction integration.

The put method properly passes the trigger key to enable enhanced eviction logic.


62-64: Essential cleanup integration.

Calling cachePolicy.removeEntry(key) during cache removal prevents memory leaks by cleaning up policy tracking data.


104-110: Well-coordinated cache clearing.

The clear() method properly clears both the internal cache map and the policy's tracking data, ensuring complete cache reset.

src/ProxyCacheManager.ts (3)

46-58: Good simplification of the constructor

Removing explicit .bind(this) calls is appropriate since class methods in TypeScript are automatically bound to the instance context.


106-117: Excellent fix for stale cache entries

This enhancement properly handles the case where cache keys exist in memory but the corresponding files are missing on the filesystem, maintaining consistency between the two layers.


159-166: Well-implemented cache clearing logic

The method correctly clears both memory cache and disk files in the proper order, maintaining consistency across both storage layers.

src/Provider/MemoryCacheLFUSizePolicy.ts (1)

42-55: Well-implemented hybrid LFU/LRU access tracking

The method effectively combines LFU frequency tracking with LRU ordering by repositioning accessed items in the Map. The use of SECOND_CHANCE_TO_COUNT for new entries is a good design choice.

Comment on lines +26 to +32
constructor(capacityMB: number) {
this.referenceBit = {} as {
[key in string]: number;
};
this.capacityBytes = capacityMB * 1024 * 1024; // Convert MB to bytes
this.storage = new FileSystemManager();
}

🛠️ Refactor suggestion

Add validation for capacity parameter and consider FileSystemManager injection

The constructor should validate the capacityMB parameter and consider dependency injection for FileSystemManager.

-constructor(capacityMB: number) {
+constructor(capacityMB: number, storage?: FileSystemManager) {
+  if (capacityMB <= 0) {
+    throw new Error('Capacity must be greater than 0 MB');
+  }
   this.referenceBit = {} as {
     [key in string]: number;
   };
   this.capacityBytes = capacityMB * 1024 * 1024; // Convert MB to bytes
-  this.storage = new FileSystemManager();
+  this.storage = storage || new FileSystemManager();
 }
🤖 Prompt for AI Agents
In src/Provider/MemoryCacheLFUSizePolicy.ts around lines 26 to 32, add
validation to ensure the capacityMB parameter is a positive number and throw an
error if it is invalid. Modify the constructor to accept an optional
FileSystemManager instance as a parameter for dependency injection, defaulting
to a new instance if none is provided. This improves robustness and testability.

Comment on lines +57 to +133
async onEvict(
cache: Map<string, any>,
delegate?: MemoryCacheDelegate<any>,
triggerKey?: string
) {
if (this.isEvicting) {
return; // Another eviction is in progress
}
try {
this.isEvicting = true;

// Get current directory size
const files = await this.storage.getStatisticList(
this.storage.getBucketFolder(FileBucket.cache)
);

let totalSize = files.reduce(
(sum, file) => sum + parseInt(file.size as unknown as string, 10),
0
);

// console.log('::::::::::::::::: REFERENCE_BIT', this.referenceBit);
// console.log('::::::::::::::::: CACHE', Object.fromEntries(cache));

// Keep evicting least frequently used items until we're under capacity
let count = 0;
while (totalSize > this.capacityBytes) {
count++;

// Don't evict if it's among last files, could be single giant file
// Don't try more than 10 files at a time per eviction check.
if (files.length <= 2 || count > 10) {
break;
}

const evictedKey = this.findLFUKey(files, cache, triggerKey);
// console.log('::::::::::::: COUNT', count, ':::');
// console.log('::::::::::::: EVICTKEY', count, evictedKey, ':::');
// console.log('::::::::::::: FILES', count, files.length, ':::');

if (!evictedKey) {
// Nothing left to evict or only the trigger file remains
break;
}

const cachedPath = cache.get(evictedKey);
// console.log('::::::::::::: CACHEPATH', count, cachedPath, ':::');
if (!cachedPath) {
delete this.referenceBit[evictedKey]; // Clean up stale reference
continue;
}

// Find the file size we're about to evict
const fileToEvict = files.find((f) => cachedPath.includes(f.filename));
if (!fileToEvict) {
// File doesn't exist on disk, clean up stale reference
cache.delete(evictedKey);
delete this.referenceBit[evictedKey];
continue;
}

// Evict the file
cache.delete(evictedKey);
delete this.referenceBit[evictedKey];
await delegate?.didEvictHandler(evictedKey, cachedPath);

// Update our running total
totalSize -= fileToEvict.size;
// file must exist or -1 will remove last item
files.splice(files.indexOf(fileToEvict), 1);

// console.log('::::::::::::: NewSize:', count, '||', totalSize, ':::');
}
} finally {
this.isEvicting = false;
}
}

⚠️ Potential issue

Fix potential array manipulation issue and remove debug logs

The eviction logic is well-structured, but there's a potential issue with array manipulation and debug code should be removed.

       // Update our running total
       totalSize -= fileToEvict.size;
-      // file must exist or -1 will remove last item
-      files.splice(files.indexOf(fileToEvict), 1);
+      const fileIndex = files.indexOf(fileToEvict);
+      if (fileIndex !== -1) {
+        files.splice(fileIndex, 1);
+      }

Also remove the commented console.log statements on lines 78-79, 93-95, and 103, 128 for cleaner production code.

🤖 Prompt for AI Agents
In src/Provider/MemoryCacheLFUSizePolicy.ts between lines 57 and 133, remove all
commented out console.log debug statements on lines 78-79, 93-95, 103, and 128
to clean up the production code. Additionally, ensure that when removing items
from the files array during eviction, the splice operation correctly targets the
intended index to avoid potential array manipulation issues.

Comment on lines +135 to +182
private findLFUKey(
files: Array<any>,
cache: Map<string, any>,
excludeKey?: string
): string | null {
let minFreq = Number.MAX_VALUE;
let lfuKey: string | null = null;

for (const key in this.referenceBit) {
// Skip the file that triggered eviction
if (key === excludeKey) continue;

const freq = this.referenceBit[key];
if (freq && freq < minFreq) {
if (freq !== SECOND_CHANCE_TO_COUNT || lfuKey === null) {
minFreq = freq;
lfuKey = key;
}
}
}

// If all items have equal frequency, use the oldest file
if (!lfuKey && Object.keys(this.referenceBit).length > 0) {
const eligibleFiles = files.filter((file) => {
if (excludeKey) {
const excludePath = cache.get(excludeKey);
return !excludePath?.includes(file.filename);
}
return true;
});

// Find the oldest file
const oldestFile = eligibleFiles.reduce((oldest, current) => {
return oldest.lastModified < current.lastModified ? oldest : current;
});

// Find the referenceBit key that corresponds to this file
// Find which cache entry has this filename
lfuKey =
Array.from(cache.entries()).find(([_, path]) =>
path.includes(oldestFile.filename)
)?.[0] ||
cache.keys().next().value || // fallback to first (oldest) key
null;
}

return lfuKey;
}
🛠️ Refactor suggestion

Add safety checks and simplify complex fallback logic

The method needs safety checks for edge cases and the fallback logic is overly complex.

       });

+      if (eligibleFiles.length === 0) {
+        return null;
+      }
+
       // Find the oldest file
       const oldestFile = eligibleFiles.reduce((oldest, current) => {
         return oldest.lastModified < current.lastModified ? oldest : current;
       });

Consider simplifying the complex fallback chain on lines 173-178 by extracting it to a separate method for better readability and maintainability.

🤖 Prompt for AI Agents
In src/Provider/MemoryCacheLFUSizePolicy.ts between lines 135 and 182, add
safety checks to handle cases where eligibleFiles might be empty before calling
reduce to avoid runtime errors. Simplify the fallback logic on lines 173-178 by
extracting the chain that finds lfuKey from cache entries or keys into a
separate helper method to improve readability and maintainability.

Comment on lines +189 to +196
set dataSource(data: { [key in string]: number }) {
const newDataSource = mergeWithCustomCondition(
this.referenceBit,
data,
mergeLargerNumber
);
this.referenceBit = newDataSource;
}
🛠️ Refactor suggestion

Add validation for dataSource setter

The setter should validate the input data structure to prevent corruption of the internal state.

 set dataSource(data: { [key in string]: number }) {
+  // Validate that all values are non-negative numbers
+  for (const key in data) {
+    if (typeof data[key] !== 'number' || data[key] < 0) {
+      throw new Error(`Invalid frequency value for key ${key}: ${data[key]}`);
+    }
+  }
   const newDataSource = mergeWithCustomCondition(
     this.referenceBit,
     data,
     mergeLargerNumber
   );
   this.referenceBit = newDataSource;
 }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
set dataSource(data: { [key in string]: number }) {
const newDataSource = mergeWithCustomCondition(
this.referenceBit,
data,
mergeLargerNumber
);
this.referenceBit = newDataSource;
}
set dataSource(data: { [key in string]: number }) {
// Validate that all values are non-negative numbers
for (const key in data) {
if (typeof data[key] !== 'number' || data[key] < 0) {
throw new Error(`Invalid frequency value for key ${key}: ${data[key]}`);
}
}
const newDataSource = mergeWithCustomCondition(
this.referenceBit,
data,
mergeLargerNumber
);
this.referenceBit = newDataSource;
}
🤖 Prompt for AI Agents
In src/Provider/MemoryCacheLFUSizePolicy.ts around lines 189 to 196, the
dataSource setter lacks validation for the input data structure, risking
internal state corruption. Add validation logic to check that the input is an
object with string keys and number values before merging. If the input is
invalid, throw an error or handle it gracefully to prevent corrupting
this.referenceBit.

Comment on lines +60 to 62
get memoryCache() {
return this._memoryCache;
}
🛠️ Refactor suggestion

Reconsider exposing the internal memory cache directly

Exposing the private _memoryCache through a public getter breaks encapsulation and could allow external code to bypass cache management logic. Consider providing specific methods for cache operations instead of exposing the entire cache object.

🤖 Prompt for AI Agents
In src/ProxyCacheManager.ts around lines 60 to 62, the public getter exposes the
private _memoryCache directly, breaking encapsulation. Remove the public getter
and instead implement specific methods that perform controlled cache operations
(e.g., get, set, delete) to interact with the cache while preserving
encapsulation and internal logic.

Labels: enhancement (New feature or request)