- 5 Posts
- 42 Comments
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 1 • 5 days ago

Sorry for the delay! I had a fun weekend…
The iSH app seems dope. Looks like it could be useful, but unfortunately I'm not able to get it to work in the way we want either. The "install" prompt doesn't work; it told me to use "--help" for more info. I did that, and it said to use "upgrade" to install instead. I did that, but I got back 'package Telnet not found'. 🥲

I appreciate the help with iOS, but maybe switching to Android would be best? My long-term goal was to switch to Android/Pixel anyway, because I heard those would be best for security/privacy concerns. And lucky me, I have a Pixel 3 I can switch everything to. I see you made another comment about how to try it on Android… I'm going to give that a shot rn!
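If the iSH route gets another try: iSH ships an Alpine Linux userland, so its package manager is apk rather than a generic "install" command, and Alpine's telnet client normally lives in the busybox-extras package (that package name is my assumption about current Alpine; 10.0.0.42 is a placeholder for the rig's actual LAN IP):

```shell
# Refresh Alpine's package index, then pull in the busybox telnet applet.
apk update
apk add busybox-extras

# Try to reach the Ollama port on the rig (placeholder address).
telnet 10.0.0.42 11434
```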
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 2 • 9 days ago

Hell ya! I would definitely appreciate some hand-holding thru this process! This self-hosting stuff is going to take a bit longer and more learning than I anticipated.
- The opening-the-port process makes sense. It seems like if I have a backend on my rig, I'm going to need to open a port to access that backend from a frontend on a phone, or possibly even access that same backend on the phone via a mirror?
- It seems like it would be easier if I could connect to the rig via an Android phone instead of an iPhone. My end goal is to use Linux, but I'm not ready for that step. Android seems like an adequate stepping stone, especially if we have to go thru all this trouble with the iPhone. Shall we try on the Android instead? If not, I'll follow the directions you put above and report back on Saturday.
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 1 • 10 days ago

Hate to say it, but it didn't work. I listed below the things I double-checked. I really appreciate you helping me troubleshoot this, but it seems like I may have bitten off more than I can chew. I chose Ollama because it was supposed to be one of the easier local AIs to set up. Do you have any recommendations for alternatives? Or do you think I should incorporate Docker or Open WebUI, as some others have said?
- When I went to the Ollama app and entered http://10.#.#.#:11434, it didn't work. I also tried the Enchanted app, and that didn't work either.
- I double-checked the rule I made to make sure it was entered properly: 10.0.0.0/24 for both the local and remote IP addresses.
- The sanity check went well. The IPv4 check said closed; the IPv6 check said failed.
- I checked with netstat -abn, and 0.0.0.0:11434 is still listening.
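A quick reachability check can narrow down where the connection dies between the firewall rule and the phone app. A minimal sketch (the 10.0.0.42 address is a placeholder for the rig's actual LAN IPv4):

```shell
# On the rig itself: confirm the Ollama API answers locally.
curl http://localhost:11434/
# Ollama's root endpoint replies with "Ollama is running".

# From another machine on the same LAN (placeholder address):
curl http://10.0.0.42:11434/
# If this times out while the local check works, the firewall rule or
# bind address is the problem, not Ollama itself.
```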
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 2 • 10 days ago

There are three lines with :11434 in them, no brackets or anything like that:
- Line 1 has 0.0.0.0 in front.
- Line 2 has 10.#.#.# in front and a foreign address that is something other than 0.0.0.0.
- Line 3 is like the second, but with a slightly different foreign address.

The iPhone does have a 10.#.#.# IP that is slightly different from the PC's.

The subnet mask is 255.255.255.0.

Oh yes, I'm on Windows 10 as well.

I've taken a pause here while we troubleshoot the subnet mask. We're getting close!!
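Those three :11434 lines can be pulled out directly instead of scanning the whole netstat dump; a small sketch (Windows, and -b needs an elevated prompt):

```shell
# Show only the lines that mention the Ollama port.
netstat -abn | findstr :11434
# A "TCP 0.0.0.0:11434 ... LISTENING" line means Ollama accepts
# connections on every interface; 10.#.#.# lines with non-zero foreign
# addresses are established sessions from clients on the LAN.
```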
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 2 • 11 days ago

Dope! This is exactly what I needed! I would say this is a very "hand-holding" explanation, which is perfect because I'm starting with 0% knowledge in this field! And I've learned so much already from this post and your comment!

So here's where I'm at:
- A backend is where all the weird C++ language stuff happens to generate a response from an AI.
- A frontend is a pretty app or webpage that takes that response and makes it more digestible for the user.
- Agreed. I've seen in other posts that exposing a port in Windows Defender Firewall is the easiest (and safest?) way to go for specifically what I'm looking for. I don't think I need to forward a port, as that would be for more remote access.
- I went to the whatismyipaddress website. The IPv6 was identical to one of the ones I have; the IPv4 was not. (But I don't think that matters moving forward.)
- I ran ipconfig in the Command Prompt to find the info, and my IPv4 is 10.blahblahblah.
- I ran netstat -abn (this is what worked to display the necessary info). I'm able to see 0.0.0.0 before the 11434! I had to go into the settings in the Ollama backend app to enable "expose Ollama to the network".

I'm ready for the next steps!
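For the port-exposing step, a rule scoped to the local subnet keeps the port off the wider internet. A sketch using netsh in an elevated prompt (the "Ollama LAN" name and the 10.0.0.0/24 scope are my assumptions based on the LAN addresses above):

```shell
# Allow inbound TCP 11434, but only from machines on the 10.0.0.0/24 subnet.
netsh advfirewall firewall add rule name="Ollama LAN" dir=in action=allow protocol=TCP localport=11434 remoteip=10.0.0.0/24
```

The remoteip scope is the part that makes this safer than a blanket allow rule: a phone on the same Wi-Fi matches it, but nothing outside the subnet does.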
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 1 • 11 days ago

Bet. Looking into that now. Thanks!

I believe I have 11 GB of VRAM, so I should be good to run decent models, from what I've been told by the other AIs.
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 11 • 11 days ago

Server is my rig, which is running Windows. Phone is an iPhone.

Exposing the port is something I've tried to do in the past with no success! When you say change the bind address, do I do that in the Windows Defender Firewall inbound rules section?
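For reference, the bind address is set on the Ollama side rather than in the firewall. One common approach (assuming a recent Ollama build, which reads the OLLAMA_HOST environment variable) is, from a Windows Command Prompt:

```shell
setx OLLAMA_HOST "0.0.0.0:11434"
```

After setting it, restart Ollama so the new value takes effect; 0.0.0.0 means "listen on every interface" instead of only 127.0.0.1, which is what lets other devices on the LAN reach it at all.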
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 1 • 11 days ago

Backend/frontend. I see those a lot, but I never got an explanation for them. In my case, the backend would be Ollama on my rig, and the frontend would be me using it on my phone, whether that's with an app or a web UI. Is that correct?

I will add Kobold to my list of AIs to check out in the future. Thanks!

Ollama has an app (or maybe interface is a better term for it) on Windows, right, that I download models to. Then I can use said app to talk to the models. I believe Reins: Chat for Ollama is the app for iPhone that allows me to use my phone to chat with the models on my Windows rig.
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 2 • 11 days ago

Yes, exactly! I would love to keep it on my network for now. I've read that "exposing a port" is something I may have to do in my Windows Firewall options.

Yes, I have Ollama on my Windows rig, but I'm down to try a different one if you suggest so. TBH, I'm not sure if LibreChat has a web UI. I think accessing the LLM on my phone via web browser would be easiest, but there are apps out there like Reins and Enchanted that I could take advantage of.

For right now I just want to do whatever is easiest so I can get a better understanding of what I'm doing wrong.
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 41 • 11 days ago

Oh! Also, I'm using Windows on my PC, and my phone is an iPhone.

I'm not using Linux yet, but that is on my to-do list for the future! After I get more comfortable with some more basics of self-hosting.
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 1 • 11 days ago

Bet, I'll try that when I get home tonight. If I don't have success, can I message you directly?
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 2 • 11 days ago

Bet. I believe what you mentioned is best for accessing my LLM no matter where I am in the world, correct? If so, I will try this one after I try what the other person suggested.
Thank you!
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone • English • 82 • 11 days ago

lol I have! They all say similar things, but it's just not working for me.
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • First Time Self Hoster- Need help with Radicale • English • 2 • 13 days ago

I thought I was stuck there, but I misspoke. I made an edit to the original post. Thanks for the insight, tho! I'm sure that'll be helpful once I actually get there.
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • First Time Self Hoster- Need help with Radicale • English • 1 • 13 days ago

I made an error in my original post. Please see the edit I made.

But I think I'm understanding a bit! I need to literally create a file named "/etc/radicale/config". Then I need to copy/paste the configuration into said file. Once I do that, I should be able to move on to authentication and then addresses.
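Creating that file can be done in one go from a terminal. A minimal sketch, with illustrative values (the htpasswd path and bcrypt choice are assumptions, not the only valid options in Radicale's [auth] section):

```shell
# Create the directory Radicale looks in by default, then write a
# minimal config file into it.
sudo mkdir -p /etc/radicale
sudo tee /etc/radicale/config > /dev/null <<'EOF'
[server]
hosts = 0.0.0.0:5232

[auth]
type = htpasswd
htpasswd_filename = /etc/radicale/users
htpasswd_encryption = bcrypt
EOF
```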
BlackSnack@lemmy.zip (OP) to Selfhosted@lemmy.world • First Time Self Hoster- Need help with Radicale • English • 1 • 13 days ago

I misspoke earlier. I wasn't having issues with the addresses part; I didn't even make it that far. I'm stuck on authentication. I updated the original post and added a pic for clarity.

Love it, thank you! I just dusted off my Raspberry Pi 4 and am experimenting with it rn. Will definitely reach out to you to troubleshoot if that's cool.
BlackSnack@lemmy.zip (OP) to Privacy@lemmy.ml • Yo yo! Help me choose some better private services! • 01 • 20 days ago

Gotta be honest, idk what half of the words you just said mean. Core count, VRAM… still have some learning to do.

My plan is to run Ollama on my rig, kinda like a server I guess, and then use my phone to tap into that whenever I need it. From what I've researched that seems doable, but it will take some setup.
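That server-style setup is roughly what Ollama already does out of the box: a background service answers API calls on port 11434, and the CLI talks to it. A minimal sketch (the model name here is just an example):

```shell
# Download a model once; it is stored locally on the rig.
ollama pull llama3.2

# Chat with it through the same local service a phone app would use.
ollama run llama3.2 "Say hello in one sentence."
```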
BlackSnack@lemmy.zip (OP) to Privacy@lemmy.ml • Yo yo! Help me choose some better private services! • 2 • 20 days ago

Nice! Good sublemmy to follow! (Is sublemmy the right word?)
Thanks for the tips! I just started playing around with ollama so I think the self hosting route is next.
Tried it on the Pixel and no luck…
Apparently it's not apk install telnet; I need to type pkg instead. I did that and got the response below (not exact, obviously):
- Get: 1 http.termux.net
- Get: 2 (similar to above)
- Fetched 246 kB
- 58 packages can be upgraded. Run 'apt list --upgradable' to see them.
- ERROR: unable to locate package telnet

So I ran the command 'apt list upgrade' (is that what it's called, a command?) and got a bunch of text back. Most of it looks something like this:
- dpkg/stable #.##.# arch64 [upgradable from #.##.#]
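The flag in that hint is --upgradable (two dashes), and since Termux's repos have no package named telnet (that is what the "unable to locate" error means), curl is an easy substitute for poking the Ollama port. A sketch (the 10.0.0.42 address is a placeholder for the rig's LAN IP):

```shell
pkg update               # refresh package lists (pkg wraps apt)
apt list --upgradable    # the command the hint was pointing at
pkg install curl         # curl stands in for the missing telnet package
curl http://10.0.0.42:11434/
```

If the curl line prints "Ollama is running", the phone can reach the rig and the remaining problems are app-side.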