I wrote an article taking PentestGPT for a spin and gave my thoughts on it. #75
Replies: 3 comments 1 reply
-
Thanks for the feedback! For the three key points you mentioned:
1. I agree. That's why people are combining embeddings with LLMs to address this, and I'm running some tests on that approach. I won't say it's promising yet, but memory issues might be resolved in the near future. You may also want to try the 32K-token GPT-4, which works amazingly well on some complex tasks.
2. Agreed. Unfortunately, even fine-tuned open-source LLMs cannot yet outperform GPT-4. I did some testing on GLM-130B, but it didn't work well. This may remain a long-standing issue until some solid open-source models come out.
3. I still hold the opposite opinion and believe this can be a solid solution for more complex tasks. I do believe that, as LLMs improve, these issues can be solved one day. Let's see how Google Sec-PaLM performs. Lastly, thanks so much for the feedback and the criticisms. I'll do more testing and try to make it better :)
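The embedding-based memory idea in point 1 can be sketched roughly like this (a toy illustration only: the bag-of-words embedding stands in for a real model-based embedder, and the stored pentest notes are made-up examples, not PentestGPT's actual implementation):

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words embedding; a real system would call a model-based embedder.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse token-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Keeps past conversation chunks and retrieves the most relevant ones,
    so only a small, relevant slice needs to fit in the LLM's context window."""
    def __init__(self):
        self.chunks = []

    def add(self, text):
        self.chunks.append((text, embed(text)))

    def retrieve(self, query, k=2):
        qv = embed(query)
        ranked = sorted(self.chunks, key=lambda c: cosine(qv, c[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

store = MemoryStore()
store.add("nmap scan found port 8080 open running Tomcat")
store.add("user asked about SQL injection on the login form")
store.add("subdomain enumeration returned dev.example.com")

print(store.retrieve("open ports scan results", k=1))
# → ['nmap scan found port 8080 open running Tomcat']
```

The point is just that retrieval over embedded history sidesteps the fixed context limit: old findings are fetched on demand by similarity rather than kept in the prompt permanently.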
-
Just picked up a job where my duties involve trying to solve this very problem. I'll be in touch; maybe we can chat later.
-
I gave PentestGPT a go and found it completely and utterly useless. It seemed to get stuck in a loop where the only response it could give me was a summary of the website in question. At no point was it able to provide any feedback regarding any of the pen tests it suggested I ask for.
-
https://mantisek.com/taking-pentestgpt-for-a-spin
I thought it fair to include this, as I offer quite a few criticisms.