zlacker

1. Matthi+(OP)[view] [source] 2021-10-27 20:52:24
I'm not against "copying" code. I just looked up "python build url query". The first link describes the `urllib.parse.urlencode` function, which takes a dict.

So I would build the query like so:

    from urllib.parse import urlencode
    urlencode({
        "action": "query",
        "format": "json",
        ...
        "gscoord": f"{latitude.value}|{longitude.value}",
    })
I think this is orders of magnitude clearer code. But that's a subjective preference that Copilot can't adjust for (although it could get better).
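
To make that concrete, here's a minimal runnable sketch (with hypothetical coordinate values standing in for `latitude.value` / `longitude.value`) showing what `urlencode` actually produces, including the escaping of the `|` separator:

    from urllib.parse import urlencode

    # Hypothetical coordinates, just for illustration
    latitude_value, longitude_value = 37.7868, -122.3999

    query = urlencode({
        "action": "query",
        "format": "json",
        "gscoord": f"{latitude_value}|{longitude_value}",
    })
    print(query)  # action=query&format=json&gscoord=37.7868%7C-122.3999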
replies(2): >>lambda+34 >>e0a74c+ld
2. lambda+34[view] [source] 2021-10-27 21:12:41
>>Matthi+(OP)
This. Code should be optimized for reading. I think this kind of code is OK for exploratory stuff, but it needs to be rewritten later.
replies(1): >>snicke+bc
3. snicke+bc[view] [source] [discussion] 2021-10-27 22:08:24
>>lambda+34
Well, code should be optimized for correctness first, and simple string concatenation will not work for URL params that need escaping.
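
To illustrate the correctness point, here's a small sketch using a hypothetical parameter value that needs escaping:

    from urllib.parse import urlencode

    # Hypothetical search parameter whose value contains characters that must be escaped
    params = {"action": "query", "srsearch": "C&A stores", "format": "json"}

    # Naive string concatenation yields an ambiguous query string:
    print("&".join(f"{k}={v}" for k, v in params.items()))
    # action=query&srsearch=C&A stores&format=json   (stray '&' and raw space)

    # urlencode escapes the value properly:
    print(urlencode(params))
    # action=query&srsearch=C%26A+stores&format=json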
replies(1): >>odonne+Gi
4. e0a74c+ld[view] [source] 2021-10-27 22:18:09
>>Matthi+(OP)
I'm surprised no one has suggested using `requests` considering how easy, safe and readable it is:

    >>> import requests, pprint
    >>> 
    >>> 
    >>> url = "https://en.wikipedia.org/w/api.php"
    >>> resp = requests.get(
    ...     url, 
    ...     params=dict(
    ...         action="query",
    ...         list="geosearch",
    ...         format="json",
    ...         gsradius=10000,
    ...         gscoord=f"{latitude.value}|{longitude.value}"
    ...     )
    ... )
    >>> 
    >>> pprint.pprint(resp.json())
    {'batchcomplete': '',
     'query': {'geosearch': [{'dist': 26.2,
                              'lat': 37.7868194444444,
                              'lon': -122.399905555556,
                              'ns': 0,
    ...
replies(1): >>thamer+gi
5. thamer+gi[view] [source] [discussion] 2021-10-27 23:00:27
>>e0a74c+ld
For what it's worth, Copilot can do it.

I typed the following prompt:

    def search_wikipedia(lat, lon):
        """
        use "requests" to do a geosearch on Wikipedia and pretty-print the resulting JSON
        """
And it completed it with:

    r = requests.get('https://en.wikipedia.org/w/api.php?action=query&list=geosearch&gsradius=10000&gscoord={0}|{1}&gslimit=20&format=json'.format(lat, lon))
    pprint.pprint(r.json())
replies(3): >>odonne+Ci >>esjeon+8s >>grenoi+7b1
6. odonne+Ci[view] [source] [discussion] 2021-10-27 23:03:07
>>thamer+gi
That doesn't exactly do what the guy above you was talking about, though.
7. odonne+Gi[view] [source] [discussion] 2021-10-27 23:03:42
>>snicke+bc
It'll certainly work, just seems sloppy.
replies(1): >>bennyg+yo
8. bennyg+yo[view] [source] [discussion] 2021-10-27 23:38:01
>>odonne+Gi
Plenty of edge cases there (e.g. URL encoding), but I don't want to preach to the choir and go down a rabbit hole on this minor detail.
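
For concreteness, here's a hypothetical sketch of that encoding edge case, contrasting plain string formatting with letting `requests` build the query string:

    import requests

    # Hypothetical coordinates; the '|' separator (and anything else special in
    # the values) must be percent-encoded to form a strictly valid URL
    lat, lon = 37.7868, -122.3999

    # String formatting sends the values exactly as typed:
    naive_url = ("https://en.wikipedia.org/w/api.php"
                 "?action=query&list=geosearch&gscoord={0}|{1}&format=json").format(lat, lon)
    print(naive_url)     # ...gscoord=37.7868|-122.3999... ('|' left unescaped)

    # Passing params lets requests handle the encoding:
    prepared = requests.Request(
        "GET",
        "https://en.wikipedia.org/w/api.php",
        params={"action": "query", "list": "geosearch",
                "gscoord": f"{lat}|{lon}", "format": "json"},
    ).prepare()
    print(prepared.url)  # ...gscoord=37.7868%7C-122.3999... (every value encoded)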
9. esjeon+8s[view] [source] [discussion] 2021-10-28 00:01:54
>>thamer+gi
It's like a junior dev who won't quit unnecessary code golfing. Somehow the AI is more comfortable with string-based URL manipulation, which is a straight-up anti-pattern.
replies(1): >>disgru+LC1
10. grenoi+7b1[view] [source] [discussion] 2021-10-28 07:49:27
>>thamer+gi
That's what the rest of the thread is complaining about: it's still slapping the strings in there with basic formatting. No different from the top-level approach.
11. disgru+LC1[view] [source] [discussion] 2021-10-28 12:22:40
>>esjeon+8s
Presumably because that's what it's seen in the training data. Remember, it doesn't care what the code does; it's just doing a search for similar-looking code.