  • So, basically, the trick to setting this up in Caddy is more one of not doing anything. Caddy is so much smarter than Nginx that it just figures all of this out for you: the HTTPS certificates, the HTTP-to-HTTPS redirect, even the WebSocket upgrade headers.

    So this:

    # Notes Server - With WebSocket
    server {
        listen 80;
        server_name notes.domain.com;
        return 301 https://$host$request_uri;
    }
    
    server {
        listen 443 ssl;
        server_name notes.domain.com;
    
        ssl_certificate /etc/letsencrypt/live/notes.domain.com/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/notes.domain.com/privkey.pem;
        include /etc/letsencrypt/options-ssl-nginx.conf;
        ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
    
        location / {
            proxy_pass http://localhost:5264/;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
            proxy_read_timeout 3600;
            proxy_send_timeout 3600;
        }
    }
    

    in Caddy becomes this:

    notes.domain.com {
            reverse_proxy IP_ADDRESS:5264
    }
    

    Yeah. This is why I love Caddy.

    In the end I only had to include a couple of the header modifiers to get everything working. So my finished file looked like this:

    auth.domain.com {
            reverse_proxy IP_ADDRESS:8264 {
                    header_up Host {host}
                    header_up X-Real-IP {remote_host}
            }
    }
    
    notes.domain.com {
            reverse_proxy IP_ADDRESS:5264
    }
    
    events.domain.com {
            reverse_proxy IP_ADDRESS:7264
    }
    
    mono.domain.com {
            reverse_proxy IP_ADDRESS:6264
            # Caddy has no equivalent of Nginx's $upstream_cache_status,
            # so there's no X-Cache-Status header here
            header Cache-Control "public, no-transform"
    }
    

    Obviously, update “domain.com” and “IP_ADDRESS” to the appropriate values. I’m actually not even 100% sure that all of that is necessary, but my setup seems to be working, including the monograph server.
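
    One small aside: if you're running Caddy as a systemd service (paths here assume the standard package install; adjust for your setup), it's worth validating the file and reloading after each change:

    caddy validate --config /etc/caddy/Caddyfile
    sudo systemctl reload caddy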

    One very important aside, though: in your .env file, don't do this:

    AUTH_SERVER_PUBLIC_URL=https://auth.domain.com/
    NOTESNOOK_APP_PUBLIC_URL=https://notes.domain.com/
    MONOGRAPH_PUBLIC_URL=https://mono.domain.com/
    ATTACHMENTS_SERVER_PUBLIC_URL=https://files.domain.com/
    

    Those trailing slashes will mess everything up; my guess is that the clients append paths directly to these values, so you end up with double slashes in the request URLs. Strip them off so it looks like this:

    AUTH_SERVER_PUBLIC_URL=https://auth.domain.com
    NOTESNOOK_APP_PUBLIC_URL=https://notes.domain.com
    MONOGRAPH_PUBLIC_URL=https://mono.domain.com
    ATTACHMENTS_SERVER_PUBLIC_URL=https://files.domain.com
    

    Took me a while to work that one out.

    I might still need to tweak some of this. I’m getting an occasional “Unknown network error” in the app, but all my notes are syncing, monographs publish just fine, and generally everything else seems to work, so I’m not entirely sure what the issue is that Notesnook is trying to tell me about, or if it’s even something I need to fix.

    Edit: OK, the issue was that I didn’t have files.domain.com set up. Just directly proxying it solves one error but creates another, so I’ll need to play with that part a little more. It’s probably down to Minio doing its own proxying on the backend (because it rewrites HTTP requests at 9009 to HTTPS at 9090). Will update when I get it working. Anyway, for now everything except attachments seems to work.
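
    In case it helps anyone fighting the same thing, the naive config I’m testing for attachments just proxies straight through to Minio (9009 is the port from my compose file; treat this as a sketch rather than a known-good setup):

    files.domain.com {
            reverse_proxy IP_ADDRESS:9009
    }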



  • Noted, I’ll be giving that a proper read after work. Thank you.

    Edit to add: Yeah, that pretty much mirrors my own experience of using AI as a coding aid. Even when I was learning a new language, I found that my comprehension of the material very quickly outstripped whatever ChatGPT could provide. I’d much rather understand what I’m building because I built it myself. A lot of the time, when you use a solution someone else provided, you don’t find out until much later how badly that solution held you back, because it wasn’t actually the best way to tackle the problem.



  • The issue is that AI is being invested in as if it can replace jobs. That’s not an issue for anyone who wants to use it as a spellchecker, but it is an issue for the economy, for society, and for the planet, because billions of dollars of computer hardware are being built and run on the assumption that trillions of dollars of payoff will be generated.

    And correcting someone’s tone in an email is not, and will never be, a trillion dollar industry.


  • I think these are actually valid examples, albeit ones that come with a really big caveat: you’re using AI in place of a skill that you really should be learning for yourself. As an autistic IT person, I get the struggle of communicating with non-technical and neurotypical people, especially clients who you have to be extra careful with. But the reality is, you can’t always do all your communication by email. If you always rely on the AI to correct your tone or simplify your language, you’re choosing not to build an essential skill, one that’s every bit as important to doing your job well as knowing how to correctly configure an ACL on a Cisco managed switch.

    That said, I can also see how relying on the AI at first can be a helpful learning tool as you build those skills. There’s certainly an argument that by using tools, but paying attention to the output of those tools, you build those skills for yourself. Learning by example works. I think used in that way, there’s potentially real value there.

    Which is kind of the broader story with Gen AI overall. It’s not that it can never be useful; it’s that, at best, it can only ever aspire to “useful.” No one has yet demonstrated any ability to make AI “essential,” and the idea that we should be investing hundreds of billions of dollars into a technology that is, on its best days, mildly useful, is sheer fucking lunacy.