
    Random Thread - Anything Goes

    • Dashrender @scottalanmiller

      @scottalanmiller said in Random Thread - Anything Goes:

      @dashrender said in Random Thread - Anything Goes:

      Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this, and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?

      Very hard to tell and you left out a possibility - why was a truck in the way of the road, and why wasn't that truck communicating its location to the Tesla? In a fully self driving world, the cars have a responsibility to talk to each other. So there are lots of failure points in this example.

      absolutely true... and we're getting there, but still probably 10-20 years off from that.
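The vehicle-to-vehicle idea above can be sketched in a few lines. This is a toy illustration, not the real SAE J2735 Basic Safety Message format; every field name and the `is_stale` freshness rule are assumptions made up for the example:

```python
import json
import time

def make_position_broadcast(vehicle_id, lat, lon, speed_mps, heading_deg):
    # Minimal position/heading message a vehicle might broadcast to
    # nearby traffic. Field names are illustrative only - the real
    # standard (SAE J2735's Basic Safety Message) is far richer.
    return json.dumps({
        "id": vehicle_id,
        "lat": lat,
        "lon": lon,
        "speed_mps": speed_mps,
        "heading_deg": heading_deg,
        "ts": time.time(),
    })

def is_stale(message_json, max_age_s=0.5):
    # A receiver should distrust old positions: V2V only helps when the
    # data is fresh, which is why real systems rebroadcast many times
    # per second.
    msg = json.loads(message_json)
    return (time.time() - msg["ts"]) > max_age_s
```

A stalled truck broadcasting something like this would let an approaching car start braking before its own sensors could classify the obstacle - which is exactly the missing piece in the crash being discussed.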

      • scottalanmiller @Dashrender

        @dashrender said in Random Thread - Anything Goes:

        @scottalanmiller said in Random Thread - Anything Goes:

        @dashrender said in Random Thread - Anything Goes:

        I do have a problem with this considering that the owner of the car, let's say it's Scott's car, is not the programmer of the system, nor can he mod it, so it seems kinda unjust to claim that Scott is to blame if the car is in an accident that is the car's fault.

        Which is why it doesn't work that way. The driver is at fault in an accident, not the owner. The owner, however, is liable for a stationary car.

        So how do you solve the driver problem? Who is the driver in a driverless car? Is it the manufacturer?

        Of course it is.

        • Dashrender @NerdyDad

          @nerdydad said in Random Thread - Anything Goes:

          @dashrender said in Random Thread - Anything Goes:

          A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.

          So today, if you're driving down the street, and your tire comes off and causes you to cause an accident, then you are of course at fault.

          But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.

          Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this, and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?

          Good question. IF the owner was awake and aware of his surroundings, would he be able to override Autopilot and take over control if he saw that the Tesla was about to make a bad decision?

          Today? yes he would.

          • scottalanmiller @Dashrender

            @dashrender said in Random Thread - Anything Goes:

            absolutely true... and we're getting there, but still probably 10-20 years off from that.

            It'll take one accident where a company is held liable for a death that it could not have prevented, and then they go after someone without that technology as criminally culpable for putting such a dangerous vehicle on the road, and things will change overnight. That's how this stuff works. Once it is clear that NOT having a self driving car is worse than driving drunk... people will start petitioning for all drivers to be arrested.

            • scottalanmiller @Dashrender

              @dashrender said in Random Thread - Anything Goes:

              Today? yes he would.

              Because it is not a self driving car, though.

              • Dashrender @scottalanmiller

                @scottalanmiller said in Random Thread - Anything Goes:

                It'll take one accident where a company is held liable for a death that it could not have prevented, and then they go after someone without that technology as criminally culpable for putting such a dangerous vehicle on the road, and things will change overnight. That's how this stuff works. Once it is clear that NOT having a self driving car is worse than driving drunk... people will start petitioning for all drivers to be arrested.

                I hear what you're saying, but the same can be and is said about guns. Cars don't kill people, people driving cars kill people.

                I do tend to agree that eventually, maybe in my lifetime, we'll see human driving on common roads become a thing of the past.

                • Dashrender @scottalanmiller

                  @scottalanmiller said in Random Thread - Anything Goes:

                  Because it is not a self driving car, though.

                  It is self driving, just not fully autonomous. That's more when the steering wheel is removed.

                  • Dashrender @Dashrender

                    @dashrender said in Random Thread - Anything Goes:

                    so how do you solve the driver problem? Who is the driver in a driverless car? is it the manufacturer?

                    You still haven't answered this question, @scottalanmiller

                    • wirestyle22 @Dashrender

                      @dashrender said in Random Thread - Anything Goes:

                      You still haven't answered this question, @scottalanmiller

                      Until they are the only thing on the road, it's a choice for the driver to get one. At that point, they have to accept responsibility for what happens, imo.

                      • scottalanmiller @Dashrender

                        @dashrender said in Random Thread - Anything Goes:

                        so how do you solve the driver problem? Who is the driver in a driverless car? is it the manufacturer?

                        You still haven't answered this question, @scottalanmiller

                        I did. You asked if it was the manufacturer and obviously the answer is yes. They alone make the driving decisions.

                        • scottalanmiller @Dashrender

                          @dashrender said in Random Thread - Anything Goes:

                          It is self driving, just not fully autonomous. That's more when the steering wheel is removed.

                          That's not self driving. It's driving assist. Not the same thing. Any car will keep going if you take your hands off of the wheel, but they don't fully drive themselves.

                          • NerdyDad @scottalanmiller

                            @scottalanmiller said in Random Thread - Anything Goes:

                            That's not self driving. It's driving assist. Not the same thing. Any car will keep going if you take your hands off of the wheel, but they don't fully drive themselves.

                            Driving assist, in the same category as cruise control. But we've had cruise control for the last 20 years with no problems (except for a few Darwin-award recipients). The driver is still aware and controlling the car, but has just told the car what speed to keep driving at. Same goes for the move from standard to automatic transmission. We don't have to tell the car what gear to drive in; it shifts automatically for us based on throttle and engine RPM.

                            At what point are we going from driver-assisted to fully autonomous? When the driver seat is the same as the passenger seat?
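NerdyDad's question has a semi-official answer: SAE J3016 defines six levels of driving automation, and the "driver assist" vs "self driving" argument above is roughly the Level 2/3 boundary. The lookup below paraphrases the level names; the `responsible_party` helper is only a rule of thumb for the thread's liability debate, not a statement of law:

```python
# SAE J3016 levels of driving automation (names paraphrased).
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: steering OR speed (e.g. adaptive cruise)",
    2: "Partial automation: steering AND speed, human must supervise",
    3: "Conditional automation: system drives, human takes over on request",
    4: "High automation: no human fallback needed within a limited domain",
    5: "Full automation: drives anywhere; a steering wheel is optional",
}

def responsible_party(level):
    # Toy rule of thumb for the liability question, not legal advice:
    # through Level 2 the human is still "the driver"; at Levels 4-5 the
    # automated system is; Level 3 is the contested middle ground.
    if level <= 2:
        return "human driver"
    if level == 3:
        return "shared / contested"
    return "automated driving system"
```

By this taxonomy the 2017-era Tesla Autopilot under discussion is Level 2, which is why the "driving assist" label fits, and the "steering wheel removed" threshold NerdyDad asks about is Level 5.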

                            • Dashrender @NerdyDad

                              @nerdydad said in Random Thread - Anything Goes:

                              Driving assist, in the same category as cruise control. But we've had cruise control for the last 20 years with no problems (except for a few Darwin-award recipients).

                              At what point are we going from driver-assisted to fully autonomous? When the driver seat is the same as the passenger seat?

                              Cruise control can't stop a car and can't change lanes - driver assist, as it's called, can and does.
                              If the programming had been able to tell that the semi-truck was a truck and not part of the sky, it likely would have stopped the car. These types of things make it very different from cruise control.
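The truck-vs-sky failure described here is a perception problem, and one standard mitigation is refusing to treat the path as clear unless independent sensors agree. The sketch below is purely illustrative - it is not Tesla's actual logic, and the 0.5 confidence threshold is an arbitrary assumption:

```python
def path_is_clear(camera_sees_obstacle, radar_sees_obstacle,
                  camera_confidence=1.0):
    # Toy sensor-fusion gate: brake if radar reports anything, or if the
    # camera reports an obstacle with reasonable confidence. In the 2016
    # crash, the camera reportedly read the white trailer as bright sky
    # and the radar return was discounted as an overhead sign, so both
    # inputs to a gate like this were effectively "no obstacle".
    if radar_sees_obstacle:
        return False  # any radar return vetoes "clear"
    if camera_sees_obstacle and camera_confidence > 0.5:
        return False  # confident camera detection also vetoes "clear"
    return True
```

The design point is the veto structure: a single un-filtered radar return would brake the car regardless of what the camera thought, which is precisely what makes this different from dumb cruise control.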

                              • NerdyDad @Dashrender

                                @dashrender said in Random Thread - Anything Goes:

                                Cruise control can't stop a car and can't change lanes - driver assist, as it's called, can and does. If the programming had been able to tell that the semi-truck was a truck and not part of the sky, it likely would have stopped the car.

                                But you still see the evolution of the car. It's more common for cars to parallel park, to monitor lanes and blind spots for you, etc.

                                • NerdyDad

                                  Wife just ordered the kids and her breakfast from this thing.

                                  [image: 0_1500046873950_signal-2017-07-14-103255.jpeg]

                                  Said it took all of 5 minutes and was very easy to do. Automating jobs already.

                                  • Dashrender @NerdyDad

                                    @nerdydad said in Random Thread - Anything Goes:

                                    But you still see the evolution of the car. It's more common for cars to parallel park, to monitor lanes and blind spots for you, etc.

                                    OK, parallel parking is autonomous driving. Can you still interfere? Probably: grab the wheel and the car will likely stop. But blind spot sensors in most cars today do nothing but sound an audible tone to alert the driver.

                                    In the case of the Tesla, the sensors (not really blind spot sensors, just sensors in general) will change lanes when in driving-assist mode (so I understand, and I will accept a correction if my understanding is false).

                                    • NerdyDad @Dashrender

                                      @dashrender said in Random Thread - Anything Goes:

                                      @nerdydad said in Random Thread - Anything Goes:

                                      @dashrender said in Random Thread - Anything Goes:

                                      @nerdydad said in Random Thread - Anything Goes:

                                      @scottalanmiller said in Random Thread - Anything Goes:

                                      @dashrender said in Random Thread - Anything Goes:

                                      @scottalanmiller said in Random Thread - Anything Goes:

                                      @dashrender said in Random Thread - Anything Goes:

                                      @nerdydad said in Random Thread - Anything Goes:

                                      @dashrender said in Random Thread - Anything Goes:

                                      A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.

                                      So today, if you're driving down the street, and your tire comes off and causes you to cause an accident, then you are of course at fault.

                                      But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.

Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this, and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?

Good question. If the owner was awake and aware of his surroundings, would he be able to override Autopilot and take over control if he saw that the Tesla was about to make a bad decision?

                                      Today? yes he would.

                                      Because it is not a self driving car, though.

                                      It is self driving, just not fully autonomous. That's more when the steering wheel is removed.

                                      That's not self driving. It's driving assist. Not the same thing. Any car will keep going if you take your hands off of the wheel, but they don't fully drive themselves.

Driving assist is in the same category as cruise control. But we've had cruise control for the last 20 years with no problems (except for a few Darwin-award recipients). The driver is still aware and controlling the car, but has just told the car what speed to keep driving at. The same goes for the move from standard to automatic transmissions: we don't have to tell the car what gear to drive in; it does it automatically for us based on throttle and engine RPM.

                                      At what point are we going from driver-assisted to fully autonomous? When the driver seat is the same as the passenger seat?

Cruise control can't stop a car and can't change lanes; driver assist, as it's called, can and does.
If the programming had been able to tell that the semi-truck was a truck and not part of the sky, it likely would have stopped the car. These types of things make it very different from cruise control.

But you still see the evolution of the car. It's more common for cars to parallel park, to monitor lanes and blind spots for you, etc.

OK, parallel parking is autonomous driving. Can you still interfere? Probably: grab the wheel and the car will likely stop. But blind-spot sensors in most cars today do nothing but sound an audible tone to alert the driver.

In the case of the Tesla, the blind-spot sensors (not really though, just sensors in general) will change lanes when in driving assist mode (so I understand, and will accept a correction if my understanding is false).

Not saying that your understanding is false. I thought I had a point, but it has more likely just dwindled into oblivion, as I don't remember what it was.

                                      • hobbit666H
                                        hobbit666
                                        last edited by

Scott is doing too many vids to keep up lol. Need an app to auto-download new stuff to my phone for offline viewing... Just don't have time to look at the moment... job for next week.

                                        • scottalanmillerS
                                          scottalanmiller @hobbit666
                                          last edited by

                                          @hobbit666 said in Random Thread - Anything Goes:

Scott is doing too many vids to keep up lol. Need an app to auto-download new stuff to my phone for offline viewing... Just don't have time to look at the moment... job for next week.

                                          I've not seen one of those. Let us know what you find. I've seen them for desktop, but not for phone.

                                          • DashrenderD
                                            Dashrender
                                            last edited by

Does YouTube now allow offline viewing?
