    OOF part of kopano-dagent broken

    Kopano Groupware Core
    • lantz

      The OOF (out of office) issue appears to remain with kopano-dagent 11.0.0; the delivery agent still aborts with a buffer overflow while delivering a message:

      Jan 21 12:12:05 kopano-dagent[194]: Starting worker for LMTP request pid 194
      Jan 21 12:12:05 kopano-dagent[194]: Resolved recipient demo@example.com as user demo
      Jan 21 12:12:05 kopano-dagent[194]: Mail will be delivered in Inbox
      *** buffer overflow detected ***: /usr/sbin/kopano-dagent terminated
      Jan 21 12:12:05 kopano-dagent[194]: ----------------------------------------------------------------------
      Jan 21 12:12:05 kopano-dagent[194]: Fatal error detected. Please report all following information.
      Jan 21 12:12:05 kopano-dagent[194]: kopano-dagent 11.0.0
      Jan 21 12:12:05 kopano-dagent[194]: OS: Ubuntu 18.04.5 LTS (Linux 5.4.0-62-generic x86_64)
      Jan 21 12:12:05 kopano-dagent[194]: Thread name: kopano-dagent
      Jan 21 12:12:05 kopano-dagent[194]: Peak RSS: 16352
      Jan 21 12:12:05 kopano-dagent[194]: Pid 194 caught SIGABRT (6), out of memory or unhandled exception, traceback:
      Jan 21 12:12:05 kopano-dagent[194]: Backtrace:
      Jan 21 12:12:05 kopano-dagent[194]: f0. /usr/lib/x86_64-linux-gnu/libkcutil.so.0(+0x4cb60) [0x7f1aab339b60]
      Jan 21 12:12:05 kopano-dagent[194]: f1. /usr/lib/x86_64-linux-gnu/libkcutil.so.0(+0x320e6) [0x7f1aab31f0e6]
      Jan 21 12:12:05 kopano-dagent[194]: f2. /usr/lib/x86_64-linux-gnu/libkcutil.so.0(+0x3510d) [0x7f1aab32210d]
      Jan 21 12:12:05 kopano-dagent[194]: f3. /lib/x86_64-linux-gnu/libpthread.so.0(+0x12980) [0x7f1aaae79980]
      Jan 21 12:12:05 kopano-dagent[194]: f4. /lib/x86_64-linux-gnu/libc.so.6(gsignal+0xc7) [0x7f1aa9f57fb7]
      Jan 21 12:12:05 kopano-dagent[194]: f5. /lib/x86_64-linux-gnu/libc.so.6(abort+0x141) [0x7f1aa9f59921]
      Jan 21 12:12:05 kopano-dagent[194]: f6. /lib/x86_64-linux-gnu/libc.so.6(+0x89967) [0x7f1aa9fa2967]
      Jan 21 12:12:05 kopano-dagent[194]: f7. /lib/x86_64-linux-gnu/libc.so.6(+0x134b8f) [0x7f1aaa04db8f]
      Jan 21 12:12:05 kopano-dagent[194]: f8. /lib/x86_64-linux-gnu/libc.so.6(+0x134bb1) [0x7f1aaa04dbb1]
      Jan 21 12:12:05 kopano-dagent[194]: f9. /lib/x86_64-linux-gnu/libc.so.6(+0x1328a0) [0x7f1aaa04b8a0]
      Jan 21 12:12:05 kopano-dagent[194]: f10. /lib/x86_64-linux-gnu/libc.so.6(__vsnprintf_chk+0x105) [0x7f1aaa04b055]
      Jan 21 12:12:05 kopano-dagent[194]: f11. /lib/x86_64-linux-gnu/libc.so.6(__snprintf_chk+0x85) [0x7f1aaa04af25]
      Jan 21 12:12:05 kopano-dagent[194]: f12. /usr/sbin/kopano-dagent(+0x139e4) [0x55a1be0059e4]
      Jan 21 12:12:05 kopano-dagent[194]: f13. /usr/sbin/kopano-dagent(+0x14df9) [0x55a1be006df9]
      Jan 21 12:12:05 kopano-dagent[194]: f14. /usr/sbin/kopano-dagent(+0x173d1) [0x55a1be0093d1]
      Jan 21 12:12:05 kopano-dagent[194]: f15. /usr/sbin/kopano-dagent(+0x19819) [0x55a1be00b819]
      Jan 21 12:12:05 kopano-dagent[194]: f16. /usr/sbin/kopano-dagent(+0x1bf66) [0x55a1be00df66]
      Jan 21 12:12:05 kopano-dagent[194]: f17. /lib/x86_64-linux-gnu/libpthread.so.0(+0x76db) [0x7f1aaae6e6db]
      Jan 21 12:12:05 kopano-dagent[194]: f18. /lib/x86_64-linux-gnu/libc.so.6(clone+0x3f) [0x7f1aaa03a71f]
      Jan 21 12:12:05 kopano-dagent[194]: Signal errno: Success, signal code: -6
      Jan 21 12:12:05 kopano-dagent[194]: Sender pid: 194, sender uid: 999, si_status: 0
      Jan 21 12:12:05 kopano-dagent[194]: Signal value: 0, faulting address: 0x3e7000000c2
      Jan 21 12:12:06 kopano-dagent[1420]: Coredump status left at system default.
      Jan 21 12:12:06 kopano-dagent[1420]: Starting kopano-dagent version 11.0.0 (pid 1420 uid 0) (LMTP mode)
      Jan 21 12:12:06 kopano-dagent[1420]: Listening on 0.0.0.0:2003 (fd 4)
      Jan 21 12:12:06 kopano-dagent[1420]: Listening on [::]:2003 (fd 5)
      Jan 21 12:12:06 kopano-dagent[1420]: Coredump status left at system default.
      Jan 21 12:12:06 kopano-dagent[1420]: Starting kopano-dagent version 11.0.0 (pid 1420 uid 999) (LMTP mode)
      Jan 21 12:12:06 kopano-dagent[1420]: Re-using fd 4 for 0.0.0.0:2003
      Jan 21 12:12:06 kopano-dagent[1420]: Re-using fd 5 for [::]:2003
      Jan 21 12:12:06 kopano-dagent[1420]: Maximum LMTP threads set to 20
      Jan 21 12:12:06 kopano-dagent[1420]: Starting statscollector
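
      For reference, the backtrace above ends in glibc's fortified printf family (frames f10 and f11, __vsnprintf_chk and __snprintf_chk), which is the machinery that prints "*** buffer overflow detected ***" and raises SIGABRT when a size-checked snprintf/vsnprintf call would overflow its destination buffer. A minimal, self-contained C sketch of that mechanism (illustrative only, not kopano-dagent code) looks like this:

      /* fortify_demo.c -- build with fortification enabled, e.g.
       *   gcc -O2 -D_FORTIFY_SOURCE=2 fortify_demo.c -o fortify_demo
       * Running it aborts with "*** buffer overflow detected ***", the same
       * glibc check seen in frames f10/f11 of the backtrace above. */
      #include <stdio.h>

      int main(void)
      {
          char buf[8];
          size_t claimed = 64;  /* larger than the real buffer: the classic mistake */

          /* The fortified wrapper knows sizeof(buf); because the claimed size
           * exceeds it, glibc calls __chk_fail() -> abort() (SIGABRT). Some
           * compilers already warn about this at build time. */
          snprintf(buf, claimed, "%s", "a string that is longer than eight bytes");
          puts(buf);
          return 0;
      }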
      
      • Solidus (Kopano) @lantz

        Hello lantz.

        Could you let me know whether you have changed the tmp path of the dagent via the following config option:

        tmp_path = your_new_path

        or whether you have set the TMP or TEMP environment variable?

        If so, could you provide me with the full path you set?
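
        For context, a hedged sketch of why both places matter: the temporary directory a daemon ends up using can come either from its own configuration or from the environment. The snippet below is purely illustrative C (not taken from the kopano-dagent source, and the real precedence may differ); it only shows the kind of lookup the question above implies.

        #include <stdio.h>
        #include <stdlib.h>

        /* Hypothetical helper, NOT kopano code: prefer an explicit tmp_path from
         * the config, fall back to the TMP/TEMP environment variables, and
         * finally to /tmp. */
        static const char *pick_tmp_dir(const char *cfg_tmp_path)
        {
            if (cfg_tmp_path != NULL && cfg_tmp_path[0] != '\0')
                return cfg_tmp_path;
            const char *env = getenv("TMP");
            if (env == NULL || env[0] == '\0')
                env = getenv("TEMP");
            return (env != NULL && env[0] != '\0') ? env : "/tmp";
        }

        int main(void)
        {
            /* With no tmp_path configured and no TMP/TEMP set, this prints /tmp. */
            printf("temporary directory: %s\n", pick_tmp_dir(NULL));
            return 0;
        }

        An unusually long path from any of these sources being formatted into a fixed-size buffer would be one plausible way to hit the fortified snprintf abort shown in the backtrace, which is presumably why the full path is being asked for here.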

        • lantz

          Hello Solidus

          Many thanks for considering my findings.

          The tmp path was not intentionally changed, and it does not appear to have been changed on the test system in question.

          Since I test kopano-dagent in a Docker environment, you should be able to reproduce the issue with the procedure below:

          git clone https://github.com/mlan/docker-kopano.git
          cd docker-kopano/demo
          make init
          make app-test_lmtp
          make app-test_oof1
          make app-test_lmtp
          make app-test_oof0
          make app-test_lmtp
          make app-logs
          make app-env
          make app-sh
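
          For anyone who wants to poke the dagent directly rather than through the make targets, the log above shows it accepting LMTP on port 2003 for the recipient demo@example.com. The following is only a sketch under those assumptions (the port must be reachable from wherever this runs, and the demo recipient must exist); it performs one minimal LMTP delivery and prints the server's replies.

          /* lmtp_probe.c -- build with: gcc lmtp_probe.c -o lmtp_probe */
          #include <arpa/inet.h>
          #include <netinet/in.h>
          #include <stdio.h>
          #include <string.h>
          #include <sys/socket.h>
          #include <unistd.h>

          /* Send one LMTP command (or NULL to just read the greeting) and print
           * the reply. Good enough for a smoke test, not a real LMTP client:
           * multi-line replies may arrive split across reads. */
          static void chat(int fd, const char *cmd)
          {
              char reply[512];
              if (cmd != NULL && write(fd, cmd, strlen(cmd)) < 0)
                  perror("write");
              ssize_t n = read(fd, reply, sizeof(reply) - 1);
              if (n > 0) {
                  reply[n] = '\0';
                  fputs(reply, stdout);
              }
          }

          int main(void)
          {
              struct sockaddr_in addr = { .sin_family = AF_INET, .sin_port = htons(2003) };
              inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

              int fd = socket(AF_INET, SOCK_STREAM, 0);
              if (fd < 0 || connect(fd, (struct sockaddr *)&addr, sizeof(addr)) != 0) {
                  perror("connect to LMTP port 2003");
                  return 1;
              }
              chat(fd, NULL);                                 /* 220 greeting */
              chat(fd, "LHLO probe.local\r\n");               /* LMTP uses LHLO */
              chat(fd, "MAIL FROM:<probe@example.com>\r\n");
              chat(fd, "RCPT TO:<demo@example.com>\r\n");     /* recipient from the log */
              chat(fd, "DATA\r\n");
              chat(fd, "Subject: OOF probe\r\n\r\nhello\r\n.\r\n");  /* body + end-of-data marker */
              chat(fd, "QUIT\r\n");
              close(fd);
              return 0;
          }

          Running it once with OOF disabled and once with OOF enabled should roughly mirror what the app-test_oof* and app-test_lmtp targets above appear to do.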
          
            • lantz

              Please accept the following additional piece of information.

              It is likely that whatever causes the issue described here was introduced after the 9th of October 2020, since OOF works just fine (in the Docker environment mentioned above) with the nightly builds that were available on that date.

              • Solidus (Kopano) @lantz

                @lantz Hello lantz.

                Thanks for the extra information. I am working on reproducing your issue and will update when I have news.

                Thanks.

                • Solidus (Kopano)

                  Hello again @lantz :)

                  I have found the issue and fixed it. As soon as the PR is merged, it should be available in a nightly build. I’ll let you know once that happens. FYI, this’ll also likely be back-ported to version 10.

                  Thanks for reporting the issue!

                  • Solidus (Kopano)

                    @lantz The fix has been merged into master. Tomorrow's build should have it; let me know if you are still facing issues with this.

                    • lantz

                      Hello Solidus,

                      This is excellent news! :)

                      I will run tests tomorrow and report back here.

                      • lantz

                        Hello @Solidus

                        I tested OOF using the nightly builds, and the issue mentioned here appears to be solved!

                        Many thanks for your help.

                        • Solidus (Kopano) @lantz

                          @lantz Fantastic! :)

                          Happy to help.
